Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/11/14 20:55:34 UTC

Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1533

See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1533/display/redirect>

Changes:


------------------------------------------
[...truncated 1.65 MB...]
19/11/14 20:55:28 INFO DAGScheduler: waiting: Set(ShuffleMapStage 132, ResultStage 133)
19/11/14 20:55:28 INFO DAGScheduler: failed: Set()
19/11/14 20:55:28 INFO DAGScheduler: Submitting ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115), which has no missing parents
19/11/14 20:55:28 INFO MemoryStore: Block broadcast_129 stored as values in memory (estimated size 57.2 KB, free 13.4 GB)
19/11/14 20:55:28 INFO MemoryStore: Block broadcast_129_piece0 stored as bytes in memory (estimated size 22.9 KB, free 13.4 GB)
19/11/14 20:55:28 INFO BlockManagerInfo: Added broadcast_129_piece0 in memory on localhost:36273 (size: 22.9 KB, free: 13.4 GB)
19/11/14 20:55:28 INFO SparkContext: Created broadcast 129 from broadcast at DAGScheduler.scala:1161
19/11/14 20:55:28 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115) (first 15 tasks are for partitions Vector(0, 1))
19/11/14 20:55:28 INFO TaskSchedulerImpl: Adding task set 132.0 with 2 tasks
19/11/14 20:55:28 INFO TaskSetManager: Starting task 1.0 in stage 132.0 (TID 159, localhost, executor driver, partition 1, NODE_LOCAL, 7760 bytes)
19/11/14 20:55:28 INFO Executor: Running task 1.0 in stage 132.0 (TID 159)
19/11/14 20:55:28 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/14 20:55:28 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/14 20:55:28 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestMgc2oe/job_3d8c8db3-b2da-4bba-8206-e0ed06e0325c/MANIFEST
19/11/14 20:55:28 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestMgc2oe/job_3d8c8db3-b2da-4bba-8206-e0ed06e0325c/MANIFEST -> 0 artifacts
19/11/14 20:55:29 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/14 20:55:29 INFO main: Logging handler created.
19/11/14 20:55:29 INFO start: Status HTTP server running at localhost:39841
19/11/14 20:55:29 INFO main: semi_persistent_directory: /tmp
19/11/14 20:55:29 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/14 20:55:29 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573764925.63_4a390ccc-d2fc-4e57-8dee-c1d2be126121', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/14 20:55:29 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573764925.63', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46901'}
19/11/14 20:55:29 INFO __init__: Creating state cache with size 0
19/11/14 20:55:29 INFO __init__: Creating insecure control channel for localhost:41103.
19/11/14 20:55:29 INFO __init__: Control channel established.
19/11/14 20:55:29 INFO __init__: Initializing SDKHarness with 12 workers.
19/11/14 20:55:29 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/14 20:55:29 INFO create_state_handler: Creating insecure state channel for localhost:45551.
19/11/14 20:55:29 INFO create_state_handler: State channel established.
19/11/14 20:55:29 INFO create_data_channel: Creating client data channel for localhost:41983
19/11/14 20:55:29 INFO GrpcDataService: Beam Fn Data client connected.
19/11/14 20:55:29 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/14 20:55:29 INFO run: No more requests from control plane
19/11/14 20:55:29 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/14 20:55:29 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/14 20:55:29 INFO close: Closing all cached grpc data channels.
19/11/14 20:55:29 INFO close: Closing all cached gRPC state handlers.
19/11/14 20:55:29 INFO run: Done consuming work.
19/11/14 20:55:29 INFO main: Python sdk harness exiting.
19/11/14 20:55:29 INFO GrpcLoggingService: Logging client hanged up.
19/11/14 20:55:29 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/14 20:55:29 INFO Executor: Finished task 1.0 in stage 132.0 (TID 159). 15229 bytes result sent to driver
19/11/14 20:55:29 INFO TaskSetManager: Starting task 0.0 in stage 132.0 (TID 160, localhost, executor driver, partition 0, PROCESS_LOCAL, 7977 bytes)
19/11/14 20:55:29 INFO Executor: Running task 0.0 in stage 132.0 (TID 160)
19/11/14 20:55:29 INFO TaskSetManager: Finished task 1.0 in stage 132.0 (TID 159) in 926 ms on localhost (executor driver) (1/2)
19/11/14 20:55:29 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestMgc2oe/job_3d8c8db3-b2da-4bba-8206-e0ed06e0325c/MANIFEST
19/11/14 20:55:29 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestMgc2oe/job_3d8c8db3-b2da-4bba-8206-e0ed06e0325c/MANIFEST -> 0 artifacts
19/11/14 20:55:30 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/14 20:55:30 INFO main: Logging handler created.
19/11/14 20:55:30 INFO start: Status HTTP server running at localhost:37401
19/11/14 20:55:30 INFO main: semi_persistent_directory: /tmp
19/11/14 20:55:30 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/14 20:55:30 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573764925.63_4a390ccc-d2fc-4e57-8dee-c1d2be126121', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/14 20:55:30 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573764925.63', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46901'}
19/11/14 20:55:30 INFO __init__: Creating state cache with size 0
19/11/14 20:55:30 INFO __init__: Creating insecure control channel for localhost:41443.
19/11/14 20:55:30 INFO __init__: Control channel established.
19/11/14 20:55:30 INFO __init__: Initializing SDKHarness with 12 workers.
19/11/14 20:55:30 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/14 20:55:30 INFO create_state_handler: Creating insecure state channel for localhost:45999.
19/11/14 20:55:30 INFO create_state_handler: State channel established.
19/11/14 20:55:30 INFO create_data_channel: Creating client data channel for localhost:36145
19/11/14 20:55:30 INFO GrpcDataService: Beam Fn Data client connected.
19/11/14 20:55:30 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/14 20:55:30 INFO run: No more requests from control plane
19/11/14 20:55:30 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/14 20:55:30 INFO close: Closing all cached grpc data channels.
19/11/14 20:55:30 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/14 20:55:30 INFO close: Closing all cached gRPC state handlers.
19/11/14 20:55:30 INFO run: Done consuming work.
19/11/14 20:55:30 INFO main: Python sdk harness exiting.
19/11/14 20:55:30 INFO GrpcLoggingService: Logging client hanged up.
19/11/14 20:55:30 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/14 20:55:30 INFO Executor: Finished task 0.0 in stage 132.0 (TID 160). 13710 bytes result sent to driver
19/11/14 20:55:30 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 160) in 872 ms on localhost (executor driver) (2/2)
19/11/14 20:55:30 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/14 20:55:30 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 1.804 s
19/11/14 20:55:30 INFO DAGScheduler: looking for newly runnable stages
19/11/14 20:55:30 INFO DAGScheduler: running: Set()
19/11/14 20:55:30 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/14 20:55:30 INFO DAGScheduler: failed: Set()
19/11/14 20:55:30 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/14 20:55:30 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.4 GB)
19/11/14 20:55:30 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.4 KB, free 13.4 GB)
19/11/14 20:55:30 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:36273 (size: 12.4 KB, free: 13.4 GB)
19/11/14 20:55:30 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/14 20:55:30 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/14 20:55:30 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/14 20:55:30 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/14 20:55:30 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/14 20:55:30 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/14 20:55:30 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/14 20:55:30 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestMgc2oe/job_3d8c8db3-b2da-4bba-8206-e0ed06e0325c/MANIFEST
19/11/14 20:55:30 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestMgc2oe/job_3d8c8db3-b2da-4bba-8206-e0ed06e0325c/MANIFEST -> 0 artifacts
19/11/14 20:55:31 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/14 20:55:31 INFO main: Logging handler created.
19/11/14 20:55:31 INFO start: Status HTTP server running at localhost:34449
19/11/14 20:55:31 INFO main: semi_persistent_directory: /tmp
19/11/14 20:55:31 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/14 20:55:31 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573764925.63_4a390ccc-d2fc-4e57-8dee-c1d2be126121', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/14 20:55:31 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573764925.63', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46901'}
19/11/14 20:55:31 INFO __init__: Creating state cache with size 0
19/11/14 20:55:31 INFO __init__: Creating insecure control channel for localhost:45735.
19/11/14 20:55:31 INFO __init__: Control channel established.
19/11/14 20:55:31 INFO __init__: Initializing SDKHarness with 12 workers.
19/11/14 20:55:31 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/14 20:55:31 INFO create_state_handler: Creating insecure state channel for localhost:42807.
19/11/14 20:55:31 INFO create_state_handler: State channel established.
19/11/14 20:55:31 INFO create_data_channel: Creating client data channel for localhost:33657
19/11/14 20:55:31 INFO GrpcDataService: Beam Fn Data client connected.
19/11/14 20:55:31 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/14 20:55:31 INFO run: No more requests from control plane
19/11/14 20:55:31 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/14 20:55:31 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/14 20:55:31 INFO close: Closing all cached grpc data channels.
19/11/14 20:55:31 INFO close: Closing all cached gRPC state handlers.
19/11/14 20:55:31 INFO run: Done consuming work.
19/11/14 20:55:31 INFO main: Python sdk harness exiting.
19/11/14 20:55:31 INFO GrpcLoggingService: Logging client hanged up.
19/11/14 20:55:31 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/14 20:55:31 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 11970 bytes result sent to driver
19/11/14 20:55:31 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 873 ms on localhost (executor driver) (1/1)
19/11/14 20:55:31 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/14 20:55:31 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 0.879 s
19/11/14 20:55:31 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 4.495475 s
19/11/14 20:55:31 INFO SparkPipelineRunner: Job test_windowing_1573764925.63_4a390ccc-d2fc-4e57-8dee-c1d2be126121 finished.
19/11/14 20:55:31 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/14 20:55:31 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktestMgc2oe/job_3d8c8db3-b2da-4bba-8206-e0ed06e0325c/MANIFEST has 0 artifact locations
19/11/14 20:55:31 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestMgc2oe/job_3d8c8db3-b2da-4bba-8206-e0ed06e0325c/
INFO:root:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 229, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 435, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 73, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
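
(For context: the "handler" frame above is the suite's per-test watchdog, which also produces the "==== Timed out ====" banners and "# Thread:" stack dumps seen further down. A minimal sketch of such a watchdog, with hypothetical names and constants, assuming a Unix worker where SIGALRM is available:)

import signal
import sys
import threading
import traceback

TIMEOUT_SECS = 60  # matches "Timed out after 60 seconds." in the traceback

def handler(signum, frame):
    msg = 'Timed out after %s seconds.' % TIMEOUT_SECS
    print('=' * 20 + ' ' + msg + ' ' + '=' * 20)
    threads = dict((t.ident, t) for t in threading.enumerate())
    # Dump every live thread's stack so a hung gRPC read shows up in the log.
    for thread_id, stack in sys._current_frames().items():
        print('\n# Thread: %s' % (threads.get(thread_id) or thread_id,))
        traceback.print_stack(stack)
    # BaseException, not Exception, so ordinary except-clauses cannot swallow it.
    raise BaseException(msg)

signal.signal(signal.SIGALRM, handler)
signal.alarm(TIMEOUT_SECS)  # re-armed before each test; signal.alarm(0) cancels

(Because the dump is written to the console while unittest is still emitting the traceback, the banner and "# Thread:" stanzas can interleave with traceback lines in the raw log, as happens below.)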

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 326, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 435, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139692170696448)>

    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
# Thread: <Thread(Thread-120, started daemon 139692179089152)>

  File "apache_beam/runners/portability/portable_runner_test.py", line 73, in handler
# Thread: <_MainThread(MainThread, started 139692958320384)>
==================== Timed out after 60 seconds. ====================

    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

# Thread: <Thread(wait_until_finish_read, started daemon 139692144207616)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
# Thread: <Thread(Thread-126, started daemon 139692152600320)>

  File "apache_beam/runners/portability/fn_api_runner_test.py", line 497, # Thread: <_MainThread(MainThread, started 139692958320384)>

# Thread: <Thread(Thread-120, started daemon 139692179089152)>

# Thread: <Thread(wait_until_finish_read, started daemon 139692170696448)>
in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 445, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1573764915.52_542fa3f5-bcae-461a-8301-52245750649c failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
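
(For context: test_sdf_with_watermark_tracking exercises a splittable DoFn. A rough, hypothetical sketch of the shape of such a transform, assuming the standard OffsetRange tracker from apache_beam.io.restriction_trackers; the watermark-tracking variant additionally reports an output watermark, but the structure is the same. The try_claim loop is where a runner may split or checkpoint an in-flight bundle, which is the capability the UnsupportedOperationException above reports as unimplemented in the Spark portable runner:)

import apache_beam as beam
from apache_beam.io.restriction_trackers import OffsetRange, OffsetRestrictionTracker
from apache_beam.transforms.core import RestrictionProvider

class ExpandStringsProvider(RestrictionProvider):
    # One claimable offset per character of the input string.
    def initial_restriction(self, element):
        return OffsetRange(0, len(element))

    def create_tracker(self, restriction):
        return OffsetRestrictionTracker(restriction)

    def restriction_size(self, element, restriction):
        return restriction.size()

class ExpandStringsDoFn(beam.DoFn):
    def process(self, element,
                restriction_tracker=beam.DoFn.RestrictionParam(ExpandStringsProvider())):
        cur = restriction_tracker.current_restriction().start
        # Every successful try_claim() is a point where the runner may
        # checkpoint the bundle and resume the remainder later.
        while restriction_tracker.try_claim(cur):
            yield element[cur]
            cur += 1

# Usage: ['abc'] | beam.ParDo(ExpandStringsDoFn())  yields 'a', 'b', 'c'.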

----------------------------------------------------------------------
Ran 38 tests in 314.196s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 55s
59 actionable tasks: 46 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/ypvagnruoulri

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python_VR_Spark #1737

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1737/display/redirect?page=changes>


---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1736

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1736/display/redirect?page=changes>

Changes:

[kcweaver] Version Flink job server container images

[kcweaver] [BEAM-8337] publish Flink job server container images

[ningk] [BEAM-7926] Data-centric Interactive Part1

[kcweaver] Get Flink version numbers from subdirectories

[kcweaver] Warn if Flink versions can't be listed.


------------------------------------------
[...truncated 1.55 MB...]
19/12/10 01:08:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:46883
19/12/10 01:08:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/10 01:08:02 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/10 01:08:02 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575940079.97_99e63056-ab2b-43ae-97e2-606351b62399', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/10 01:08:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575940079.97', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58733', 'job_port': u'0'}
19/12/10 01:08:02 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/10 01:08:02 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:40145.
19/12/10 01:08:02 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/10 01:08:02 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/10 01:08:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/10 01:08:02 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:40853.
19/12/10 01:08:02 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/10 01:08:02 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:42997
19/12/10 01:08:02 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/10 01:08:02 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/10 01:08:02 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/10 01:08:02 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/10 01:08:02 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/10 01:08:02 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/10 01:08:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/10 01:08:02 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/10 01:08:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/10 01:08:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/10 01:08:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/10 01:08:03 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/10 01:08:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/10 01:08:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/10 01:08:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:41917
19/12/10 01:08:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/10 01:08:03 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/10 01:08:03 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575940079.97_99e63056-ab2b-43ae-97e2-606351b62399', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/10 01:08:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575940079.97', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58733', 'job_port': u'0'}
19/12/10 01:08:03 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/10 01:08:03 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:39631.
19/12/10 01:08:03 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/10 01:08:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/10 01:08:03 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/10 01:08:03 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:40747.
19/12/10 01:08:03 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/10 01:08:03 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:32889
19/12/10 01:08:03 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/10 01:08:03 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/10 01:08:03 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/10 01:08:03 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/10 01:08:03 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/10 01:08:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/10 01:08:03 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/10 01:08:03 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/10 01:08:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/10 01:08:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/10 01:08:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/10 01:08:04 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/10 01:08:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/10 01:08:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/10 01:08:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:36863
19/12/10 01:08:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/10 01:08:04 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/10 01:08:04 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575940079.97_99e63056-ab2b-43ae-97e2-606351b62399', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/10 01:08:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575940079.97', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58733', 'job_port': u'0'}
19/12/10 01:08:04 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/10 01:08:04 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:34651.
19/12/10 01:08:04 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/10 01:08:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/10 01:08:04 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/10 01:08:04 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:44799.
19/12/10 01:08:04 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/10 01:08:04 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:34641
19/12/10 01:08:04 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/10 01:08:04 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/10 01:08:04 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/10 01:08:04 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/10 01:08:04 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/10 01:08:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/10 01:08:04 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/10 01:08:04 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/10 01:08:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/10 01:08:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/10 01:08:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/10 01:08:04 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/10 01:08:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/10 01:08:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/10 01:08:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:38251
19/12/10 01:08:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/10 01:08:05 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/10 01:08:05 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575940079.97_99e63056-ab2b-43ae-97e2-606351b62399', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/10 01:08:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575940079.97', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58733', 'job_port': u'0'}
19/12/10 01:08:05 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/10 01:08:05 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:39609.
19/12/10 01:08:05 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/10 01:08:05 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/10 01:08:05 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/10 01:08:05 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:43377.
19/12/10 01:08:05 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/10 01:08:05 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:37915
19/12/10 01:08:05 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/10 01:08:05 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/10 01:08:05 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/10 01:08:05 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/10 01:08:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/10 01:08:05 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/10 01:08:05 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/10 01:08:05 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/10 01:08:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/10 01:08:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/10 01:08:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/10 01:08:05 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575940079.97_99e63056-ab2b-43ae-97e2-606351b62399 finished.
19/12/10 01:08:05 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/10 01:08:05 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_be1ea735-f5f0-4aa1-aba9-404145ec7f6a","basePath":"/tmp/sparktestjR7oTh"}: {}
java.io.FileNotFoundException: /tmp/sparktestjR7oTh/job_be1ea735-f5f0-4aa1-aba9-404145ec7f6a/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 437, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
==================== Timed out after 60 seconds. ====================
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 437, in __exit__
    self.run().wait_until_finish()

  File "apache_beam/runners/portability/portable_ru# Thread: <Thread(wait_until_finish_read, started daemon 140280113014528)>

nner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
# Thread: <Thread(Thread-119, started daemon 140280096229120)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
# Thread: <_MainThread(MainThread, started 140280892753664)>
==================== Timed out after 60 seconds. ====================

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
# Thread: <Thread(wait_until_finish_read, started daemon 140279470356224)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
# Thread: <Thread(Thread-123, started daemon 140280086787840)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

# Thread: <_MainThread(MainThread, started 140280892753664)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
# Thread: <Thread(Thread-119, started daemon 140280096229120)>

  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 437, in __exit__
    self.run().wait_until_finish()
# Thread: <Thread(wait_until_finish_read, started daemon 140280113014528)>
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575940071.06_db01f780-054e-4e1f-89db-4815b11c31a8 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 319.342s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 58s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/5fxnkvyk5jluu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1735

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1735/display/redirect?page=changes>

Changes:

[pabloem] [BEAM-8335] Adds support for multi-output TestStream (#9953)


------------------------------------------
[...truncated 1.55 MB...]
19/12/09 22:32:13 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 22:32:13 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 22:32:14 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 22:32:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 22:32:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:33585
19/12/09 22:32:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 22:32:14 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 22:32:14 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575930732.01_04192864-f9f4-4557-80cb-e9c06487600c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 22:32:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575930732.01', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50649', 'job_port': u'0'}
19/12/09 22:32:14 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 22:32:14 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:34871.
19/12/09 22:32:14 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/09 22:32:14 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 22:32:14 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 22:32:14 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:45863.
19/12/09 22:32:14 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 22:32:14 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:34697
19/12/09 22:32:14 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 22:32:14 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 22:32:14 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 22:32:14 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 22:32:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 22:32:14 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 22:32:14 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 22:32:14 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 22:32:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 22:32:14 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 22:32:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 22:32:14 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 22:32:15 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 22:32:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 22:32:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:40597
19/12/09 22:32:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 22:32:15 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 22:32:15 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575930732.01_04192864-f9f4-4557-80cb-e9c06487600c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 22:32:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575930732.01', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50649', 'job_port': u'0'}
19/12/09 22:32:15 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 22:32:15 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:37651.
19/12/09 22:32:15 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/09 22:32:15 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 22:32:15 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 22:32:15 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:37431.
19/12/09 22:32:15 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 22:32:15 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:37759
19/12/09 22:32:15 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
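
The "insecure control/state channel" lines above are the Python harness dialing the runner's Fn API endpoints over plaintext gRPC. A minimal sketch of that handshake using grpcio only (the port is taken from this log; real code wires the channel into generated Fn API stubs, omitted here):

    import grpc

    # Dial the control endpoint the runner advertised; "insecure" just
    # means plaintext, which is fine for localhost loopback traffic.
    channel = grpc.insecure_channel('localhost:37651')
    grpc.channel_ready_future(channel).result(timeout=60)
    print('Control channel established.')
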
19/12/09 22:32:15 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 22:32:15 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 22:32:15 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 22:32:15 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 22:32:15 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 22:32:15 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 22:32:15 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 22:32:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 22:32:15 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 22:32:15 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 22:32:15 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 22:32:16 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 22:32:16 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 22:32:16 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:40605
19/12/09 22:32:16 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 22:32:16 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 22:32:16 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575930732.01_04192864-f9f4-4557-80cb-e9c06487600c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 22:32:16 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575930732.01', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50649', 'job_port': u'0'}
19/12/09 22:32:16 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 22:32:16 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:39769.
19/12/09 22:32:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/09 22:32:16 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 22:32:16 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 22:32:16 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:45149.
19/12/09 22:32:16 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 22:32:16 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:36083
19/12/09 22:32:16 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 22:32:16 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 22:32:16 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 22:32:16 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 22:32:16 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 22:32:16 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 22:32:16 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 22:32:16 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 22:32:16 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 22:32:16 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 22:32:16 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 22:32:16 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 22:32:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 22:32:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 22:32:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:38353
19/12/09 22:32:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 22:32:17 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 22:32:17 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575930732.01_04192864-f9f4-4557-80cb-e9c06487600c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 22:32:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575930732.01', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50649', 'job_port': u'0'}
19/12/09 22:32:17 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 22:32:17 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:34027.
19/12/09 22:32:17 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 22:32:17 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 22:32:17 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/09 22:32:17 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:43745.
19/12/09 22:32:17 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 22:32:17 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:37715
19/12/09 22:32:17 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 22:32:17 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 22:32:17 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 22:32:17 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 22:32:17 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 22:32:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 22:32:17 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 22:32:17 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 22:32:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 22:32:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 22:32:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 22:32:17 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575930732.01_04192864-f9f4-4557-80cb-e9c06487600c finished.
19/12/09 22:32:17 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/09 22:32:17 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_a0f58f6c-4dba-44df-919e-3ce0502a6998","basePath":"/tmp/sparktest91Ft0Y"}: {}
java.io.FileNotFoundException: /tmp/sparktest91Ft0Y/job_a0f58f6c-4dba-44df-919e-3ce0502a6998/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
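
The FileNotFoundException above is cleanup noise rather than the test failure itself: this job staged no artifacts (note the repeated "GetManifest for __no_artifacts_staged__"), so the MANIFEST file the cleanup tries to read never existed. A tolerant cleanup would treat that as "already clean"; a sketch in Python (the function name is invented for illustration, the real logic being the Java removeArtifacts in the trace above):

    import os
    import shutil

    def remove_job_staging_dir(base_path, session_id):
        """Best-effort cleanup: a missing MANIFEST means nothing was staged."""
        staging_dir = os.path.join(base_path, session_id)
        manifest = os.path.join(staging_dir, 'MANIFEST')
        if not os.path.exists(manifest):
            return  # nothing staged; skip instead of raising
        shutil.rmtree(staging_dir, ignore_errors=True)
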
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
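
The "# Thread: ..." lines scattered through this report come from the test suite's timeout watchdog: portable_runner_test.py line 75 installs a handler that dumps the live threads and raises BaseException, chosen so it escapes plain "except Exception" blocks. A minimal sketch of that mechanism, assuming POSIX signals (names here are illustrative, not the exact suite code):

    import signal
    import threading

    TIMEOUT_SECS = 60

    def handler(signum, frame):
        msg = 'Timed out after %d seconds.' % TIMEOUT_SECS
        print('==================== %s ====================' % msg)
        for t in threading.enumerate():
            print('# Thread: %s' % t)  # same shape as the dumps in this log
        raise BaseException(msg)  # BaseException escapes "except Exception"

    signal.signal(signal.SIGALRM, handler)
    signal.alarm(TIMEOUT_SECS)  # re-armed by the suite for each test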

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139626339813120)>
# Thread: <Thread(Thread-116, started daemon 139626348205824)>
# Thread: <_MainThread(MainThread, started 139627127944960)>
==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139626313848576)>
# Thread: <Thread(Thread-122, started daemon 139626322241280)>
# Thread: <_MainThread(MainThread, started 139627127944960)>
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575930723.54_4186c2fb-8121-47be-a20f-0933e14da306 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
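
This failure is more specific than the timeouts: SDF watermark tracking forces a mid-bundle checkpoint, and the Spark portable runner has not registered a handler to receive the deferred (residual) work, hence the UnsupportedOperationException. What a checkpoint hands back can be shown with a toy offset-based restriction tracker (illustrative only, not the runner's API):

    class OffsetRange(object):
        def __init__(self, start, stop):
            self.start, self.stop = start, stop

        def __repr__(self):
            return 'OffsetRange(%d, %d)' % (self.start, self.stop)

    class OffsetRestrictionTracker(object):
        """Tracks progress through [start, stop)."""

        def __init__(self, restriction):
            self._restriction = restriction
            self._next_unclaimed = restriction.start

        def try_claim(self, position):
            if position < self._restriction.stop:
                self._next_unclaimed = position + 1
                return True
            return False  # restriction exhausted

        def checkpoint(self):
            # Everything not yet claimed becomes the residual, which a
            # bundle checkpoint handler would have to reschedule.
            residual = OffsetRange(self._next_unclaimed, self._restriction.stop)
            self._restriction = OffsetRange(self._restriction.start,
                                            self._next_unclaimed)
            return residual

For example, after claiming offsets 0..4 of OffsetRange(0, 10), checkpoint() returns OffsetRange(5, 10); a runner with no handler registered for that residual can only throw, as seen here.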

----------------------------------------------------------------------
Ran 38 tests in 294.635s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1
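
The Gradle message is plain exit-code propagation: sparkValidatesRunner runs the test suite through "sh", and any non-zero exit fails the task. The same contract in miniature (the 'exit 1' stands in for the failing test script):

    import subprocess

    try:
        subprocess.check_call(['sh', '-c', 'exit 1'])  # stand-in for the task's script
    except subprocess.CalledProcessError as err:
        print('Process exited with %d -> task FAILED' % err.returncode)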

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 31s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/ja5j63mpm6jru

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1734

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1734/display/redirect?page=changes>

Changes:

[heejong] [BEAM-8903] handling --jar_packages experimental flag in PortableRunner


------------------------------------------
[...truncated 1.55 MB...]
19/12/09 18:13:37 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:37727
19/12/09 18:13:37 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 18:13:37 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 18:13:37 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575915215.18_baab558b-d6c5-43ba-9edd-ed0bba7fa088', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 18:13:37 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575915215.18', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58009', 'job_port': u'0'}
19/12/09 18:13:37 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:44541.
19/12/09 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 18:13:37 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/09 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:39105.
19/12/09 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 18:13:37 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:36713
19/12/09 18:13:37 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 18:13:37 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 18:13:37 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 18:13:37 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 18:13:37 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 18:13:37 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 18:13:37 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 18:13:37 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 18:13:38 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 18:13:38 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 18:13:38 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:39215
19/12/09 18:13:38 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 18:13:38 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 18:13:38 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575915215.18_baab558b-d6c5-43ba-9edd-ed0bba7fa088', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 18:13:38 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575915215.18', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58009', 'job_port': u'0'}
19/12/09 18:13:38 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:41373.
19/12/09 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 18:13:38 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/09 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:40849.
19/12/09 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 18:13:38 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:43801
19/12/09 18:13:38 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 18:13:38 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 18:13:38 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 18:13:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 18:13:38 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 18:13:38 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 18:13:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 18:13:38 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 18:13:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 18:13:39 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 18:13:39 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:41651
19/12/09 18:13:39 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 18:13:39 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 18:13:39 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575915215.18_baab558b-d6c5-43ba-9edd-ed0bba7fa088', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 18:13:39 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575915215.18', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58009', 'job_port': u'0'}
19/12/09 18:13:39 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 18:13:39 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:45687.
19/12/09 18:13:39 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 18:13:39 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 18:13:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/09 18:13:39 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:43385.
19/12/09 18:13:39 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 18:13:39 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:38781
19/12/09 18:13:39 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 18:13:39 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 18:13:39 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 18:13:39 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 18:13:39 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 18:13:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 18:13:39 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 18:13:39 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 18:13:39 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 18:13:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 18:13:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 18:13:39 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 18:13:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 18:13:40 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 18:13:40 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:39989
19/12/09 18:13:40 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 18:13:40 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 18:13:40 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575915215.18_baab558b-d6c5-43ba-9edd-ed0bba7fa088', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 18:13:40 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575915215.18', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58009', 'job_port': u'0'}
19/12/09 18:13:40 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 18:13:40 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:41587.
19/12/09 18:13:40 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 18:13:40 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 18:13:40 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/09 18:13:40 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:34507.
19/12/09 18:13:40 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 18:13:40 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:39195
19/12/09 18:13:40 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 18:13:40 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 18:13:40 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 18:13:40 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 18:13:40 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 18:13:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 18:13:40 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 18:13:40 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 18:13:40 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 18:13:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 18:13:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 18:13:40 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575915215.18_baab558b-d6c5-43ba-9edd-ed0bba7fa088 finished.
19/12/09 18:13:40 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/09 18:13:40 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_4de5cfcd-0dad-4e22-9d73-f8f0b1992961","basePath":"/tmp/sparktestrNVtlv"}: {}
java.io.FileNotFoundException: /tmp/sparktestrNVtlv/job_4de5cfcd-0dad-4e22-9d73-f8f0b1992961/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139728162645760)>
# Thread: <Thread(Thread-120, started daemon 139728154253056)>
# Thread: <_MainThread(MainThread, started 139729287423744)>
==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139728137467648)>
# Thread: <Thread(Thread-124, started daemon 139728145860352)>
# Thread: <Thread(Thread-120, started daemon 139728154253056)>
# Thread: <Thread(wait_until_finish_read, started daemon 139728162645760)>
# Thread: <_MainThread(MainThread, started 139729287423744)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575915207.26_5894b3c6-6fb0-4395-9732-c8e5700a2208 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 344.483s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 20s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/wmzs676h6pnly

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1733

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1733/display/redirect?page=changes>

Changes:

[dcavazos] [BEAM-7390] Add code snippet for Mean

[nielm] Add limit on number of mutated rows to batching/sorting stages.


------------------------------------------
[...truncated 1.54 MB...]
19/12/09 17:50:25 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1575913824.26_55dcbf41-75bc-4cd6-90c1-110dc0bb998e on Spark master local
19/12/09 17:50:25 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
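
This warning matters for correctness: the Spark runner's GroupNonMergingWindowsFunctions groups elements by their encoded key bytes, which is only safe when byte-equality and value-equality coincide ("consistent with equals"). The property being checked, as a sketch (the helper name is invented here):

    def consistent_with_equals(coder, a, b):
        # Grouping by encoded bytes is safe only if this holds for all a, b.
        return (a == b) == (coder.encode(a) == coder.encode(b))
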
19/12/09 17:50:25 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575913824.26_55dcbf41-75bc-4cd6-90c1-110dc0bb998e: Pipeline translated successfully. Computing outputs
19/12/09 17:50:25 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 17:50:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 17:50:25 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 17:50:25 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:37601
19/12/09 17:50:25 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 17:50:25 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 17:50:25 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575913824.26_55dcbf41-75bc-4cd6-90c1-110dc0bb998e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 17:50:25 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575913824.26', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44153', 'job_port': u'0'}
19/12/09 17:50:25 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 17:50:25 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:45333.
19/12/09 17:50:25 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 17:50:25 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 17:50:25 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 258-1
19/12/09 17:50:25 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:34645.
19/12/09 17:50:25 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 17:50:25 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:44701
19/12/09 17:50:25 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 17:50:25 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 17:50:25 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 17:50:25 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 17:50:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 17:50:25 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 17:50:25 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 17:50:25 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 17:50:25 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 17:50:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 17:50:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 17:50:26 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 17:50:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 17:50:26 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 17:50:26 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:42917
19/12/09 17:50:26 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 17:50:26 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 17:50:26 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575913824.26_55dcbf41-75bc-4cd6-90c1-110dc0bb998e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 17:50:26 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575913824.26', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44153', 'job_port': u'0'}
19/12/09 17:50:26 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 17:50:26 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:38947.
19/12/09 17:50:26 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/09 17:50:26 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 17:50:26 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 17:50:26 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:42731.
19/12/09 17:50:26 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 17:50:26 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:43735
19/12/09 17:50:26 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 17:50:26 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 17:50:26 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 17:50:26 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 17:50:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 17:50:26 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 17:50:26 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 17:50:26 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 17:50:26 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 17:50:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 17:50:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 17:50:26 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 17:50:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 17:50:27 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 17:50:27 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:36873
19/12/09 17:50:27 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 17:50:27 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 17:50:27 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575913824.26_55dcbf41-75bc-4cd6-90c1-110dc0bb998e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 17:50:27 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575913824.26', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44153', 'job_port': u'0'}
19/12/09 17:50:27 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 17:50:27 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:35647.
19/12/09 17:50:27 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 17:50:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/09 17:50:27 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 17:50:27 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:39271.
19/12/09 17:50:27 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 17:50:27 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 17:50:27 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:42263
19/12/09 17:50:27 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 17:50:27 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 17:50:27 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 17:50:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 17:50:27 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 17:50:27 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 17:50:27 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 17:50:27 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 17:50:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 17:50:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 17:50:27 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 17:50:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 17:50:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 17:50:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:39309
19/12/09 17:50:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 17:50:28 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 17:50:28 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575913824.26_55dcbf41-75bc-4cd6-90c1-110dc0bb998e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 17:50:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575913824.26', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44153', 'job_port': u'0'}
19/12/09 17:50:28 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 17:50:28 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:44753.
19/12/09 17:50:28 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/09 17:50:28 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 17:50:28 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 17:50:28 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:45279.
19/12/09 17:50:28 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 17:50:28 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:42871
19/12/09 17:50:28 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 17:50:28 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 17:50:28 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 17:50:28 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 17:50:28 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 17:50:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 17:50:28 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 17:50:28 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 17:50:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 17:50:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 17:50:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 17:50:28 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 17:50:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 17:50:29 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 17:50:29 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:44081
19/12/09 17:50:29 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 17:50:29 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 17:50:29 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575913824.26_55dcbf41-75bc-4cd6-90c1-110dc0bb998e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 17:50:29 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575913824.26', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44153', 'job_port': u'0'}
19/12/09 17:50:29 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 17:50:29 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:45375.
19/12/09 17:50:29 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 17:50:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/09 17:50:29 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 17:50:29 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:45793.
19/12/09 17:50:29 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 17:50:29 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:35891
19/12/09 17:50:29 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 17:50:29 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 17:50:29 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 17:50:29 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 17:50:29 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 17:50:29 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 17:50:29 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 17:50:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 17:50:29 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 17:50:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 17:50:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 17:50:29 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575913824.26_55dcbf41-75bc-4cd6-90c1-110dc0bb998e finished.
19/12/09 17:50:29 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/09 17:50:29 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_611a5cae-20e5-408f-a1d5-b36b36a8a2a2","basePath":"/tmp/sparktestKn42fl"}: {}
java.io.FileNotFoundException: /tmp/sparktestKn42fl/job_611a5cae-20e5-408f-a1d5-b36b36a8a2a2/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
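
The FileNotFoundException above is teardown noise rather than the test failure itself: every GetManifest call in this log resolved to __no_artifacts_staged__, so when the job service removes the staging directory after the job reaches DONE there is no MANIFEST file to load, and removeArtifacts throws. A tolerant cleanup would check for the manifest before opening it. Below is a minimal sketch of that guard, written in Python for brevity even though the service in the stack trace is Java; remove_staged_artifacts is an invented name, not Beam's API.

    import errno
    import os
    import shutil

    def remove_staged_artifacts(base_path, session_id):
        """Best-effort staging cleanup that tolerates a missing MANIFEST."""
        job_dir = os.path.join(base_path, session_id)
        manifest = os.path.join(job_dir, 'MANIFEST')
        if not os.path.exists(manifest):
            # Nothing was staged for this job (cf. the repeated
            # "GetManifest for __no_artifacts_staged__" lines above),
            # so there is nothing to clean up; skip instead of raising.
            return
        try:
            shutil.rmtree(job_dir)
        except OSError as e:
            # Tolerate a concurrent cleaner having removed the directory.
            if e.errno != errno.ENOENT:
                raise
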
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
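
The BaseException here is raised by the test suite's own watchdog, not by gRPC: per the traceback, a handler at portable_runner_test.py line 75 fires after a fixed timeout, interrupts the blocked wait_until_finish call, and (as the next failure shows) also dumps the live threads as "# Thread: <...>" lines. A minimal sketch of that pattern follows; install_watchdog and its internals are invented for illustration and may differ from the real handler.

    import signal
    import threading

    TIMEOUT_SECS = 60  # matches "Timed out after 60 seconds." above

    def install_watchdog(timeout_secs=TIMEOUT_SECS):
        """Abort a hung test by raising from a SIGALRM handler (sketch)."""
        def handler(signum, frame):
            msg = 'Timed out after %d seconds.' % timeout_secs
            print('=' * 20 + ' ' + msg + ' ' + '=' * 20)
            # Dump every live thread; in interleaved stderr these show up
            # as the "# Thread: <...>" lines scattered through the log.
            for t in threading.enumerate():
                print('# Thread: %s' % t)
            # BaseException (not Exception) so it escapes broad
            # "except Exception" blocks inside the runner and gRPC.
            raise BaseException(msg)
        signal.signal(signal.SIGALRM, handler)
        signal.alarm(timeout_secs)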

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575913816.41_e7496de5-5756-4596-b053-c4a306ceff8b failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140431663650560)>
# Thread: <Thread(Thread-119, started daemon 140431655257856)>
# Thread: <_MainThread(MainThread, started 140432443389696)>

----------------------------------------------------------------------
Ran 38 tests in 263.596s

FAILED (errors=2, skipped=9)
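
Unlike the timeout above, test_sdf_with_watermark_tracking fails inside the pipeline itself: a splittable DoFn that tracks watermarks hands unfinished work back to the runner as a checkpoint, and the UnsupportedOperationException says the portable Spark runner's ActiveBundle has no handler registered to accept that residual. A rough sketch of the kind of self-checkpointing DoFn that exercises this path is below; the class names are invented, and the Python SDF API has shifted between Beam releases, so treat it as a shape, not the test's actual code.

    import apache_beam as beam
    from apache_beam.io.restriction_trackers import (
        OffsetRange, OffsetRestrictionTracker)
    from apache_beam.transforms.core import RestrictionProvider

    class _EachCharProvider(RestrictionProvider):
        # One offset per character of the input string.
        def initial_restriction(self, element):
            return OffsetRange(0, len(element))

        def create_tracker(self, restriction):
            return OffsetRestrictionTracker(restriction)

        def restriction_size(self, element, restriction):
            return restriction.size()

    class EmitChars(beam.DoFn):
        def process(self, element,
                    tracker=beam.DoFn.RestrictionParam(_EachCharProvider())):
            cur = tracker.current_restriction().start
            if tracker.try_claim(cur):
                yield element[cur]
                # Hand the unclaimed remainder back to the runner; this
                # is the checkpoint the Spark portable runner rejects.
                tracker.defer_remainder()

    # Usage: beam.Create(['abc']) | beam.ParDo(EmitChars()) would emit one
    # character per bundle, checkpointing after each element.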

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 55s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/lb743pq37khum

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1732

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1732/display/redirect?page=changes>

Changes:

[github] Changing RowAsDictJsonCoder implementation for efficiency (#10300)

[github] Merge pull request #10151: [BEAM-7116] Remove use of KV in Schema


------------------------------------------
[...truncated 1.54 MB...]
19/12/09 16:58:31 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1575910710.89_cf097cfe-16f5-42ec-8002-4d9af9623a9f on Spark master local
19/12/09 16:58:31 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/12/09 16:58:31 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575910710.89_cf097cfe-16f5-42ec-8002-4d9af9623a9f: Pipeline translated successfully. Computing outputs
19/12/09 16:58:31 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 16:58:33 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 16:58:33 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 16:58:33 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:40919
19/12/09 16:58:33 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 16:58:33 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 16:58:33 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575910710.89_cf097cfe-16f5-42ec-8002-4d9af9623a9f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 16:58:33 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575910710.89', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40169', 'job_port': u'0'}
19/12/09 16:58:33 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 16:58:33 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:36993.
19/12/09 16:58:33 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 16:58:33 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 258-1
19/12/09 16:58:33 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 16:58:33 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:39161.
19/12/09 16:58:33 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 16:58:33 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:46663
19/12/09 16:58:33 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 16:58:33 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 16:58:33 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 16:58:33 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 16:58:33 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 16:58:33 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 16:58:33 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 16:58:33 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 16:58:33 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 16:58:33 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 16:58:33 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 16:58:33 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 16:58:33 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 16:58:33 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 16:58:33 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:37491
19/12/09 16:58:34 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 16:58:34 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 16:58:34 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575910710.89_cf097cfe-16f5-42ec-8002-4d9af9623a9f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 16:58:34 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575910710.89', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40169', 'job_port': u'0'}
19/12/09 16:58:34 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:33405.
19/12/09 16:58:34 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:41857.
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 16:58:34 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:46185
19/12/09 16:58:34 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 16:58:34 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 16:58:34 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 16:58:34 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 16:58:34 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 16:58:34 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 16:58:34 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 16:58:34 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 16:58:34 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 16:58:34 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 16:58:34 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:35791
19/12/09 16:58:34 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 16:58:34 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 16:58:34 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575910710.89_cf097cfe-16f5-42ec-8002-4d9af9623a9f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 16:58:34 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575910710.89', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40169', 'job_port': u'0'}
19/12/09 16:58:34 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:38381.
19/12/09 16:58:34 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:44513.
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 16:58:34 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:38493
19/12/09 16:58:34 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 16:58:34 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 16:58:34 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 16:58:34 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 16:58:34 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 16:58:34 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 16:58:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 16:58:35 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 16:58:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 16:58:35 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 16:58:35 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:39593
19/12/09 16:58:35 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 16:58:35 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 16:58:35 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575910710.89_cf097cfe-16f5-42ec-8002-4d9af9623a9f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 16:58:35 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575910710.89', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40169', 'job_port': u'0'}
19/12/09 16:58:35 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 16:58:35 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:44567.
19/12/09 16:58:35 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 16:58:35 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 16:58:35 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/09 16:58:35 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:35541.
19/12/09 16:58:35 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 16:58:35 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:41561
19/12/09 16:58:35 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 16:58:35 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 16:58:35 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 16:58:35 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 16:58:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 16:58:35 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 16:58:35 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 16:58:35 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 16:58:35 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 16:58:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 16:58:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 16:58:35 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 16:58:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 16:58:36 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 16:58:36 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:33543
19/12/09 16:58:36 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 16:58:36 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 16:58:36 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575910710.89_cf097cfe-16f5-42ec-8002-4d9af9623a9f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 16:58:36 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575910710.89', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40169', 'job_port': u'0'}
19/12/09 16:58:36 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 16:58:36 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:45585.
19/12/09 16:58:36 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/09 16:58:36 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 16:58:36 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 16:58:36 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:36269.
19/12/09 16:58:36 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 16:58:36 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:36271
19/12/09 16:58:36 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 16:58:36 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 16:58:36 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 16:58:36 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 16:58:36 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 16:58:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 16:58:36 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 16:58:36 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 16:58:36 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 16:58:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 16:58:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 16:58:36 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575910710.89_cf097cfe-16f5-42ec-8002-4d9af9623a9f finished.
19/12/09 16:58:36 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/09 16:58:36 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_26fbcb58-6bf1-428b-9821-76f67e85271a","basePath":"/tmp/sparktestYC2f7d"}: {}
java.io.FileNotFoundException: /tmp/sparktestYC2f7d/job_26fbcb58-6bf1-428b-9821-76f67e85271a/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575910703.23_f383ba58-4745-4b22-ac38-2d7365a9bd69 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140595631769344)>
# Thread: <Thread(Thread-120, started daemon 140595284670208)>
# Thread: <_MainThread(MainThread, started 140596411008768)>

----------------------------------------------------------------------
Ran 38 tests in 286.679s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 19s
60 actionable tasks: 56 executed, 4 from cache

Publishing build scan...
https://scans.gradle.com/s/jzxwv7uqdgftq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1731

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1731/display/redirect?page=changes>

Changes:

[michal.walenia] [BEAM-8895] Add BigQuery table name sanitization to BigQueryIOIT

[michal.walenia] [BEAM-8918] Split batch BQIOIT into avro and json using tests


------------------------------------------
[...truncated 1.55 MB...]
19/12/09 12:53:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:36339
19/12/09 12:53:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 12:53:14 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 12:53:14 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575895992.03_c97f5a38-9831-429e-90aa-52ea4b57eb68', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 12:53:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575895992.03', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35417', 'job_port': u'0'}
19/12/09 12:53:14 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 12:53:14 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:38649.
19/12/09 12:53:14 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/09 12:53:14 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 12:53:14 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 12:53:14 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:46525.
19/12/09 12:53:14 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 12:53:14 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:46527
19/12/09 12:53:14 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 12:53:14 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 12:53:14 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 12:53:14 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 12:53:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:53:14 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 12:53:14 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 12:53:14 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 12:53:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 12:53:14 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 12:53:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:53:14 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 12:53:15 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 12:53:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 12:53:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:40829
19/12/09 12:53:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 12:53:15 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 12:53:15 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575895992.03_c97f5a38-9831-429e-90aa-52ea4b57eb68', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 12:53:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575895992.03', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35417', 'job_port': u'0'}
19/12/09 12:53:15 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 12:53:15 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:37033.
19/12/09 12:53:15 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/09 12:53:15 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 12:53:15 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 12:53:15 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:34659.
19/12/09 12:53:15 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 12:53:15 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:45653
19/12/09 12:53:15 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 12:53:15 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 12:53:15 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 12:53:15 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 12:53:15 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:53:15 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 12:53:15 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 12:53:15 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 12:53:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 12:53:15 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 12:53:15 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:53:15 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 12:53:16 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 12:53:16 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 12:53:16 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:46305
19/12/09 12:53:16 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 12:53:16 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 12:53:16 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575895992.03_c97f5a38-9831-429e-90aa-52ea4b57eb68', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 12:53:16 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575895992.03', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35417', 'job_port': u'0'}
19/12/09 12:53:16 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 12:53:16 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:33491.
19/12/09 12:53:16 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 12:53:16 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 12:53:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/09 12:53:16 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:39537.
19/12/09 12:53:16 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 12:53:16 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:46153
19/12/09 12:53:16 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 12:53:16 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 12:53:16 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 12:53:16 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 12:53:16 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 12:53:16 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:53:16 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 12:53:16 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 12:53:16 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 12:53:16 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 12:53:16 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:53:16 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 12:53:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 12:53:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 12:53:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:41155
19/12/09 12:53:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 12:53:17 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 12:53:17 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575895992.03_c97f5a38-9831-429e-90aa-52ea4b57eb68', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 12:53:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575895992.03', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35417', 'job_port': u'0'}
19/12/09 12:53:17 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 12:53:17 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:37345.
19/12/09 12:53:17 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 12:53:17 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 12:53:17 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/09 12:53:17 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:38467.
19/12/09 12:53:17 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 12:53:17 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:46013
19/12/09 12:53:17 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 12:53:17 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 12:53:17 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 12:53:17 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 12:53:17 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 12:53:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:53:17 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 12:53:17 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 12:53:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 12:53:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 12:53:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:53:17 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575895992.03_c97f5a38-9831-429e-90aa-52ea4b57eb68 finished.
19/12/09 12:53:17 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/09 12:53:17 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_19057db3-6693-4158-bb4c-e570b713686b","basePath":"/tmp/sparktestI3AjBJ"}: {}
java.io.FileNotFoundException: /tmp/sparktestI3AjBJ/job_19057db3-6693-4158-bb4c-e570b713686b/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
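[Editor's note: this MANIFEST warning is benign. The job staged no artifacts (see "GetManifest for __no_artifacts_staged__" above), so the staging directory was never created and the post-job cleanup finds nothing to delete. A minimal sketch of the defensive pattern, in Python for illustration only; the real cleanup is the Java code in BeamFileSystemArtifactStagingService.removeArtifacts shown in the trace, and remove_staging_dir below is a hypothetical helper:

import os
import shutil

def remove_staging_dir(base_path, session_id):
    """Best-effort removal of a job's artifact staging directory."""
    job_dir = os.path.join(base_path, session_id)
    # If no MANIFEST was ever written (nothing staged), there is nothing
    # to clean up; returning early avoids surfacing a FileNotFoundException.
    if not os.path.exists(os.path.join(job_dir, 'MANIFEST')):
        return
    shutil.rmtree(job_dir, ignore_errors=True)
]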
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140595502864128)>

# Thread: <Thread(Thread-119, started daemon 140595494471424)>

# Thread: <_MainThread(MainThread, started 140596290995968)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140595007579904)>

# Thread: <Thread(Thread-125, started daemon 140595015972608)>

# Thread: <Thread(Thread-119, started daemon 140595494471424)>

# Thread: <Thread(wait_until_finish_read, started daemon 140595502864128)>

# Thread: <_MainThread(MainThread, started 140596290995968)>
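[Editor's note: the "# Thread:" lines that arrive shuffled through these tracebacks come from the test suite's watchdog. portable_runner_test.py installs a SIGALRM handler (the "handler" frame at line 75 above) that prints a banner plus a dump of live threads and then raises BaseException, all while other threads are still writing output, hence the interleaving. A rough sketch of that pattern, with details assumed rather than copied from the Beam source:

import signal
import threading

TIMEOUT_SECS = 60

def handler(signum, frame):
    msg = 'Timed out after %d seconds.' % TIMEOUT_SECS
    print('=' * 20 + ' ' + msg + ' ' + '=' * 20)
    # Dump every live thread so a hung test points at the stuck wait.
    for t in threading.enumerate():
        print('# Thread: %s' % t)
    raise BaseException(msg)  # unwinds the blocked wait_until_finish()

signal.signal(signal.SIGALRM, handler)  # arm before running the test body
signal.alarm(TIMEOUT_SECS)
]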

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575895983.48_ea2a82bc-014e-4c72-9363-80a6c5c6ce41 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
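[Editor's note: this failure is a runner limitation rather than flakiness. A splittable DoFn with watermark tracking asks the runner to checkpoint and resume the remaining restriction, and the Spark portable runner has no bundle checkpoint handler registered, hence the UnsupportedOperationException. For reference, the general shape of a splittable DoFn like the one this test drives; this is a sketch, and the exact restriction-tracker API varies across Beam versions, so treat the signatures as illustrative:

import apache_beam as beam
from apache_beam.io.restriction_trackers import OffsetRange, OffsetRestrictionTracker
from apache_beam.transforms.core import RestrictionProvider

class ExpandStringsProvider(RestrictionProvider):
    """One offset position per character of the input string."""
    def initial_restriction(self, element):
        return OffsetRange(0, len(element))

    def create_tracker(self, restriction):
        return OffsetRestrictionTracker(restriction)

    def restriction_size(self, element, restriction):
        return restriction.size()

class ExpandStringsDoFn(beam.DoFn):
    def process(self, element,
                tracker=beam.DoFn.RestrictionParam(ExpandStringsProvider())):
        position = tracker.current_restriction().start
        while tracker.try_claim(position):  # runner may split/checkpoint here
            yield element[position]
            position += 1
]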

----------------------------------------------------------------------
Ran 38 tests in 312.712s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
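[Editor's note: to reproduce locally, the failing task can be rerun with the suggested flags, e.g. ./gradlew :sdks:python:test-suites:portable:py2:sparkValidatesRunner --stacktrace from a Beam checkout; the wrapper location is assumed.]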

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 14s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/kszdjin5vdoww

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1730

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1730/display/redirect>

Changes:


------------------------------------------
[...truncated 1.54 MB...]
19/12/09 12:13:50 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1575893629.23_eb025089-2f52-4119-bc90-75f4501347dc on Spark master local
19/12/09 12:13:50 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
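[Editor's note: this warning matters for correctness. The Spark runner groups keys by their encoded bytes, so a coder whose encoding is not "consistent with equals" (equal values may encode to different byte strings) can scatter one logical key across several groups. A toy check of the property, using a hypothetical helper rather than Beam API:

def consistent_with_equals(encode, a, b):
    """A coder is consistent with equals when equal values always encode
    to identical bytes, making byte-wise grouping match logical grouping."""
    if a == b:
        return encode(a) == encode(b)
    return True  # the property only constrains values that compare equal
]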
19/12/09 12:13:50 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575893629.23_eb025089-2f52-4119-bc90-75f4501347dc: Pipeline translated successfully. Computing outputs
19/12/09 12:13:50 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 12:13:50 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 12:13:50 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 12:13:50 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:41237
19/12/09 12:13:50 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 12:13:50 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 12:13:50 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575893629.23_eb025089-2f52-4119-bc90-75f4501347dc', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 12:13:50 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575893629.23', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43627', 'job_port': u'0'}
19/12/09 12:13:50 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:37729.
19/12/09 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 12:13:50 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 258-1
19/12/09 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:45755.
19/12/09 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 12:13:50 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:45137
19/12/09 12:13:50 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 12:13:50 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 12:13:50 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:13:50 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 12:13:50 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 12:13:50 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 12:13:51 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:13:51 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 12:13:51 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 12:13:51 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 12:13:51 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:46495
19/12/09 12:13:51 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 12:13:51 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 12:13:51 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575893629.23_eb025089-2f52-4119-bc90-75f4501347dc', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 12:13:51 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575893629.23', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43627', 'job_port': u'0'}
19/12/09 12:13:51 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:41469.
19/12/09 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 12:13:51 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/09 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:37459.
19/12/09 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 12:13:51 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:41015
19/12/09 12:13:51 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 12:13:51 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 12:13:51 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 12:13:51 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 12:13:51 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 12:13:51 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 12:13:51 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:13:51 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 12:13:52 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 12:13:52 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 12:13:52 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:44165
19/12/09 12:13:52 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 12:13:52 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 12:13:52 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575893629.23_eb025089-2f52-4119-bc90-75f4501347dc', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 12:13:52 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575893629.23', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43627', 'job_port': u'0'}
19/12/09 12:13:52 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:35463.
19/12/09 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 12:13:52 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/09 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:35545.
19/12/09 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 12:13:52 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:40121
19/12/09 12:13:52 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 12:13:52 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 12:13:52 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:13:52 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 12:13:52 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 12:13:52 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 12:13:52 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:13:52 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 12:13:53 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 12:13:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 12:13:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:33959
19/12/09 12:13:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 12:13:53 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 12:13:53 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575893629.23_eb025089-2f52-4119-bc90-75f4501347dc', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 12:13:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575893629.23', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43627', 'job_port': u'0'}
19/12/09 12:13:53 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:43591.
19/12/09 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 12:13:53 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/09 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:34465.
19/12/09 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 12:13:53 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:43919
19/12/09 12:13:53 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 12:13:53 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 12:13:53 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 12:13:53 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 12:13:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 12:13:53 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 12:13:53 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:13:53 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 12:13:54 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 12:13:54 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 12:13:54 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:46121
19/12/09 12:13:54 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 12:13:54 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 12:13:54 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575893629.23_eb025089-2f52-4119-bc90-75f4501347dc', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 12:13:54 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575893629.23', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43627', 'job_port': u'0'}
19/12/09 12:13:54 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 12:13:54 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:44817.
19/12/09 12:13:54 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/09 12:13:54 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 12:13:54 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 12:13:54 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:38255.
19/12/09 12:13:54 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 12:13:54 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:34501
19/12/09 12:13:54 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 12:13:54 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 12:13:54 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 12:13:54 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 12:13:54 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:13:54 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 12:13:54 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 12:13:54 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 12:13:54 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 12:13:54 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 12:13:54 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:13:54 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575893629.23_eb025089-2f52-4119-bc90-75f4501347dc finished.
19/12/09 12:13:54 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/09 12:13:54 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_edf0ba45-d07a-4d5a-9618-2a573fb5acc8","basePath":"/tmp/sparktestOPI8ZM"}: {}
java.io.FileNotFoundException: /tmp/sparktestOPI8ZM/job_edf0ba45-d07a-4d5a-9618-2a573fb5acc8/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139980113262336)>

# Thread: <Thread(Thread-120, started daemon 139980381812480)>

# Thread: <_MainThread(MainThread, started 139980900333312)>
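[Editor's note: for context, test_pardo_state_with_custom_key_coder exercises user state keyed by a type with a custom coder; the timeout above is the client giving up on the job-state stream, not an assertion failure. The general shape of such a stateful ParDo, sketched after the standard Beam userstate API (the test's actual coder and assertions are not reproduced here):

import apache_beam as beam
from apache_beam.coders import VarIntCoder
from apache_beam.transforms.userstate import BagStateSpec

class IndexAssigningDoFn(beam.DoFn):
    # Per-key bag state; reads and writes travel over the Fn API state channel.
    INDEX_STATE = BagStateSpec('index', VarIntCoder())

    def process(self, kv, index_state=beam.DoFn.StateParam(INDEX_STATE)):
        key, value = kv
        next_index = len(list(index_state.read()))  # items seen for this key
        index_state.add(1)
        yield (key, value, next_index)
]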
======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575893621.28_e8f13f82-1cc0-400a-903a-80a1d1530c27 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 274.506s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 0s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/2bqu2v23gbm7c

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1729

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1729/display/redirect>

Changes:


------------------------------------------
[...truncated 1.54 MB...]
19/12/09 06:12:58 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1575871977.89_8789b5f2-f332-457c-8345-3c8b23e54d58 on Spark master local
19/12/09 06:12:58 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/12/09 06:12:58 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575871977.89_8789b5f2-f332-457c-8345-3c8b23e54d58: Pipeline translated successfully. Computing outputs
19/12/09 06:12:58 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 06:12:59 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 06:12:59 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 06:12:59 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:35495
19/12/09 06:12:59 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 06:12:59 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 06:12:59 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575871977.89_8789b5f2-f332-457c-8345-3c8b23e54d58', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 06:12:59 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575871977.89', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54893', 'job_port': u'0'}
19/12/09 06:12:59 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 06:12:59 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:36409.
19/12/09 06:12:59 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 06:12:59 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 258-1
19/12/09 06:12:59 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 06:12:59 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:36613.
19/12/09 06:12:59 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 06:12:59 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:42563
19/12/09 06:12:59 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 06:12:59 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 06:12:59 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 06:12:59 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 06:12:59 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 06:12:59 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 06:12:59 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 06:12:59 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 06:12:59 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 06:12:59 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 06:12:59 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
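For readers following the repeated bootstrap above ("Creating insecure control channel ... State channel established ... Creating client data channel"), those lines correspond to plain gRPC channel setup between the runner and the SDK harness. A minimal sketch of that setup, assuming the ports shown in this cycle (the runner assigns fresh ones per environment):

    import grpc

    # Ports copied from the log cycle above; they change on every environment.
    CONTROL, STATE, DATA = 'localhost:36409', 'localhost:36613', 'localhost:42563'

    def connect(target):
        channel = grpc.insecure_channel(target)                 # "Creating insecure ... channel"
        grpc.channel_ready_future(channel).result(timeout=10)   # "... channel established."
        return channel

    control = connect(CONTROL)  # Fn API control plane
    state = connect(STATE)      # state requests
    data = connect(DATA)        # data plane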
19/12/09 06:12:59 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 06:13:00 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 06:13:00 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 06:13:00 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:46503
19/12/09 06:13:00 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 06:13:00 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 06:13:00 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575871977.89_8789b5f2-f332-457c-8345-3c8b23e54d58', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 06:13:00 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575871977.89', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54893', 'job_port': u'0'}
19/12/09 06:13:00 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 06:13:00 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:40351.
19/12/09 06:13:00 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 06:13:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/09 06:13:00 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 06:13:00 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:33413.
19/12/09 06:13:00 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 06:13:00 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:33735
19/12/09 06:13:00 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 06:13:00 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 06:13:00 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 06:13:00 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 06:13:00 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 06:13:00 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 06:13:00 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 06:13:00 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 06:13:00 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 06:13:00 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 06:13:00 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 06:13:00 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 06:13:01 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 06:13:01 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 06:13:01 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:46761
19/12/09 06:13:01 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 06:13:01 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 06:13:01 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575871977.89_8789b5f2-f332-457c-8345-3c8b23e54d58', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 06:13:01 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575871977.89', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54893', 'job_port': u'0'}
19/12/09 06:13:01 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 06:13:01 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:33025.
19/12/09 06:13:01 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/09 06:13:01 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 06:13:01 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 06:13:01 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:43703.
19/12/09 06:13:01 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 06:13:01 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:33801
19/12/09 06:13:01 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 06:13:01 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 06:13:01 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 06:13:01 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 06:13:01 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 06:13:01 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 06:13:01 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 06:13:01 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 06:13:01 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 06:13:01 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 06:13:01 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 06:13:01 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 06:13:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 06:13:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 06:13:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:34999
19/12/09 06:13:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 06:13:02 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 06:13:02 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575871977.89_8789b5f2-f332-457c-8345-3c8b23e54d58', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 06:13:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575871977.89', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54893', 'job_port': u'0'}
19/12/09 06:13:02 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:40177.
19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 06:13:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:42649.
19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 06:13:02 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:39097
19/12/09 06:13:02 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 06:13:02 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 06:13:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 06:13:02 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 06:13:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 06:13:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 06:13:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 06:13:02 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 06:13:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 06:13:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 06:13:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:46309
19/12/09 06:13:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 06:13:02 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 06:13:02 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575871977.89_8789b5f2-f332-457c-8345-3c8b23e54d58', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 06:13:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575871977.89', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54893', 'job_port': u'0'}
19/12/09 06:13:02 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:44493.
19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 06:13:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:39143.
19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 06:13:03 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:37615
19/12/09 06:13:03 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 06:13:03 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 06:13:03 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 06:13:03 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 06:13:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 06:13:03 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 06:13:03 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 06:13:03 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 06:13:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 06:13:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 06:13:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 06:13:03 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575871977.89_8789b5f2-f332-457c-8345-3c8b23e54d58 finished.
19/12/09 06:13:03 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
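The monitoring-infos warning above means metrics queried from this job's PipelineResult come back empty on the portable Spark runner. On a runner that does report them, the query side looks roughly like this sketch (the pipeline and counter name are made up for illustration):

    import apache_beam as beam
    from apache_beam.metrics.metric import Metrics, MetricsFilter

    p = beam.Pipeline()  # hypothetical pipeline on a metrics-capable runner
    counter = Metrics.counter('demo', 'elements')
    _ = (p
         | beam.Create([1, 2, 3])
         | beam.Map(lambda x: counter.inc() or x))  # inc() returns None, so x passes through

    result = p.run()
    result.wait_until_finish()
    query = result.metrics().query(MetricsFilter().with_name('elements'))
    for metric in query['counters']:
        print(metric.key, metric.committed)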
19/12/09 06:13:03 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_018d9b40-6542-4f5d-8665-7f436242bd62","basePath":"/tmp/sparktestP9l80U"}: {}
java.io.FileNotFoundException: /tmp/sparktestP9l80U/job_018d9b40-6542-4f5d-8665-7f436242bd62/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
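The FileNotFoundException above is cleanup noise rather than the failure itself: this job staged no artifacts (note the repeated "GetManifest for __no_artifacts_staged__"), so the MANIFEST that removeArtifacts tries to read was never written. A tolerant-cleanup sketch of the same idea (the helper is hypothetical, not Beam API; the paths are the ones from the log):

    import os
    import shutil

    def remove_staging_dir(base_path, session_id):
        # Hypothetical mirror of BeamFileSystemArtifactStagingService.removeArtifacts:
        # treat a missing MANIFEST as "nothing was staged" instead of raising.
        job_dir = os.path.join(base_path, session_id)
        if not os.path.exists(os.path.join(job_dir, 'MANIFEST')):
            return  # nothing to clean up
        shutil.rmtree(job_dir)

    remove_staging_dir('/tmp/sparktestP9l80U',
                       'job_018d9b40-6542-4f5d-8665-7f436242bd62')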
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140230031533824)>

# Thread: <Thread(Thread-119, started daemon 140230039926528)>

# Thread: <_MainThread(MainThread, started 140230828058368)>
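The "Timed out after 60 seconds" banner and the "# Thread: ..." lines above come from the test suite's own watchdog (portable_runner_test.py line 75 in the traceback): a timeout handler that dumps the live threads and then raises BaseException in whatever frame the main thread is blocked in, here grpc's wait loop. A minimal sketch of that pattern, assuming a SIGALRM-based alarm and the 60-second budget seen in the log:

    import signal
    import threading
    import time

    def handler(signum, frame):
        msg = 'Timed out after 60 seconds.'
        print('==================== %s ====================' % msg)
        for t in threading.enumerate():   # produces the "# Thread: ..." lines
            print('# Thread: %s' % t)
        raise BaseException(msg)          # surfaces in the blocked frame

    def run_pipeline_under_test():        # stand-in for the real test body
        time.sleep(120)                   # simulate a hung wait_until_finish()

    signal.signal(signal.SIGALRM, handler)
    signal.alarm(60)                      # arm the 60 s watchdog
    try:
        run_pipeline_under_test()
    finally:
        signal.alarm(0)                   # disarm once the body returns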

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575871969.54_dd5ea035-13c9-4976-a184-9f94407f49a7 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
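For reference, the assert_that(actual, equal_to(...)) in the failing line is Beam's in-pipeline assertion: it compares the PCollection's contents to the expected list when the pipeline actually runs, so a mismatch (or, as here, a runner error) fails wait_until_finish() rather than the test body. A self-contained example of the construct, unrelated to the SDF under test:

    import apache_beam as beam
    from apache_beam.testing.util import assert_that, equal_to

    with beam.Pipeline() as p:
        actual = p | beam.Create(['a', 'b', 'c'])
        # The assertion is itself a transform; it fails the job on mismatch.
        assert_that(actual, equal_to(['a', 'b', 'c']))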

----------------------------------------------------------------------
Ran 38 tests in 297.799s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 50s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/wzutcuyg3z7so

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1728

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1728/display/redirect>

Changes:


------------------------------------------
[...truncated 1.55 MB...]
19/12/09 00:12:42 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:34137
19/12/09 00:12:42 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 00:12:42 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 00:12:42 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575850359.75_2fde8fc6-9f14-4730-8244-e1b350629713', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 00:12:42 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575850359.75', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:57041', 'job_port': u'0'}
19/12/09 00:12:42 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 00:12:42 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:43567.
19/12/09 00:12:42 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/09 00:12:42 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 00:12:42 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 00:12:42 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:33645.
19/12/09 00:12:42 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 00:12:42 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:41005
19/12/09 00:12:42 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 00:12:42 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 00:12:42 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 00:12:42 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 00:12:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 00:12:42 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 00:12:42 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 00:12:42 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 00:12:42 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 00:12:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 00:12:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 00:12:43 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 00:12:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 00:12:43 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 00:12:43 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:40833
19/12/09 00:12:43 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 00:12:43 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 00:12:43 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575850359.75_2fde8fc6-9f14-4730-8244-e1b350629713', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 00:12:43 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575850359.75', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:57041', 'job_port': u'0'}
19/12/09 00:12:43 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 00:12:43 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:41901.
19/12/09 00:12:43 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 00:12:43 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 00:12:43 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/09 00:12:43 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:44979.
19/12/09 00:12:43 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 00:12:43 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:34941
19/12/09 00:12:43 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 00:12:43 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 00:12:43 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 00:12:43 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 00:12:43 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 00:12:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 00:12:43 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 00:12:43 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 00:12:43 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 00:12:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 00:12:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 00:12:44 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 00:12:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 00:12:44 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 00:12:44 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:33131
19/12/09 00:12:44 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 00:12:44 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 00:12:44 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575850359.75_2fde8fc6-9f14-4730-8244-e1b350629713', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 00:12:44 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575850359.75', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:57041', 'job_port': u'0'}
19/12/09 00:12:44 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 00:12:44 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:35059.
19/12/09 00:12:44 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 00:12:44 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/09 00:12:44 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 00:12:44 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:46667.
19/12/09 00:12:44 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 00:12:44 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:41459
19/12/09 00:12:44 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 00:12:44 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 00:12:44 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 00:12:44 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 00:12:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 00:12:44 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 00:12:44 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 00:12:44 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 00:12:44 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 00:12:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 00:12:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 00:12:44 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 00:12:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 00:12:45 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 00:12:45 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:36275
19/12/09 00:12:45 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 00:12:45 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 00:12:45 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575850359.75_2fde8fc6-9f14-4730-8244-e1b350629713', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 00:12:45 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575850359.75', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:57041', 'job_port': u'0'}
19/12/09 00:12:45 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 00:12:45 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:46437.
19/12/09 00:12:45 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 00:12:45 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/09 00:12:45 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 00:12:45 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:35153.
19/12/09 00:12:45 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 00:12:45 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:34015
19/12/09 00:12:45 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 00:12:45 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 00:12:45 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 00:12:45 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 00:12:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 00:12:45 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 00:12:45 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 00:12:45 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 00:12:45 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 00:12:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 00:12:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 00:12:45 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575850359.75_2fde8fc6-9f14-4730-8244-e1b350629713 finished.
19/12/09 00:12:45 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/09 00:12:45 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_e82fa660-183a-4189-9915-1747c8f5b470","basePath":"/tmp/sparktestv2Lc9c"}: {}
java.io.FileNotFoundException: /tmp/sparktestv2Lc9c/job_e82fa660-183a-4189-9915-1747c8f5b470/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140220864980736)>

# Thread: <Thread(Thread-115, started daemon 140220848195328)>

# Thread: <_MainThread(MainThread, started 140221644719872)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140220839802624)>

# Thread: <Thread(Thread-121, started daemon 140220831409920)>

# Thread: <Thread(Thread-115, started daemon 140220848195328)>

# Thread: <_MainThread(MainThread, started 140221644719872)>

# Thread: <Thread(wait_until_finish_read, started daemon 140220864980736)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575850351.13_e116f408-4977-49cc-b93b-81ef9d4450ff failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 333.530s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 26s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/ozswi6dgw6dyq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1727

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1727/display/redirect>

Changes:


------------------------------------------
[...truncated 1.55 MB...]
19/12/08 18:22:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 18:22:02 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 18:22:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 18:22:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 18:22:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:37313
19/12/08 18:22:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 18:22:02 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 18:22:02 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575829320.32_0df509fe-6c0d-492a-9932-d08df59871c2', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 18:22:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575829320.32', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46469', 'job_port': u'0'}
19/12/08 18:22:02 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 18:22:02 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:39133.
19/12/08 18:22:02 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 18:22:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/08 18:22:02 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 18:22:02 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:41259.
19/12/08 18:22:02 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 18:22:02 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:36747
19/12/08 18:22:02 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 18:22:02 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/08 18:22:02 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 18:22:02 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 18:22:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 18:22:02 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 18:22:02 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 18:22:02 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 18:22:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 18:22:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 18:22:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 18:22:03 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 18:22:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 18:22:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 18:22:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:34205
19/12/08 18:22:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 18:22:03 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 18:22:03 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575829320.32_0df509fe-6c0d-492a-9932-d08df59871c2', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 18:22:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575829320.32', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46469', 'job_port': u'0'}
19/12/08 18:22:03 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 18:22:03 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:43965.
19/12/08 18:22:03 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 18:22:03 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 18:22:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/08 18:22:03 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:43973.
19/12/08 18:22:03 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 18:22:03 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:35863
19/12/08 18:22:03 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 18:22:03 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/08 18:22:03 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 18:22:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 18:22:03 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 18:22:03 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 18:22:03 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 18:22:03 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 18:22:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 18:22:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 18:22:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 18:22:03 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 18:22:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 18:22:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 18:22:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:34991
19/12/08 18:22:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 18:22:04 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 18:22:04 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575829320.32_0df509fe-6c0d-492a-9932-d08df59871c2', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 18:22:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575829320.32', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46469', 'job_port': u'0'}
19/12/08 18:22:04 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 18:22:04 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:43483.
19/12/08 18:22:04 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 18:22:04 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 18:22:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/08 18:22:04 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:38133.
19/12/08 18:22:04 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 18:22:04 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:40907
19/12/08 18:22:04 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 18:22:04 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/08 18:22:04 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 18:22:04 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 18:22:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 18:22:04 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 18:22:04 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 18:22:04 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 18:22:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 18:22:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 18:22:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 18:22:04 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 18:22:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 18:22:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 18:22:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:32841
19/12/08 18:22:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 18:22:05 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 18:22:05 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575829320.32_0df509fe-6c0d-492a-9932-d08df59871c2', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 18:22:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575829320.32', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46469', 'job_port': u'0'}
19/12/08 18:22:05 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 18:22:05 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:39443.
19/12/08 18:22:05 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 18:22:05 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/08 18:22:05 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 18:22:05 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:39489.
19/12/08 18:22:05 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 18:22:05 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:33279
19/12/08 18:22:05 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 18:22:05 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/08 18:22:05 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 18:22:05 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 18:22:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 18:22:05 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 18:22:05 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 18:22:05 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 18:22:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 18:22:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 18:22:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 18:22:05 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575829320.32_0df509fe-6c0d-492a-9932-d08df59871c2 finished.
19/12/08 18:22:05 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/08 18:22:05 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_17233b3d-973c-48b1-ac71-d98d3b7dc085","basePath":"/tmp/sparktestBWmh4O"}: {}
java.io.FileNotFoundException: /tmp/sparktestBWmh4O/job_17233b3d-973c-48b1-ac71-d98d3b7dc085/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
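(Editor's note: the cleanup warning above is benign for the run itself; the job transitions to DONE just below. The earlier "GetManifest for __no_artifacts_staged__" lines show that nothing was staged, so the MANIFEST file never existed. A minimal Python sketch of the guard such a cleanup could apply, under the assumption that a missing MANIFEST means no artifacts were staged; remove_artifacts and staging_dir are illustrative names, not Beam's actual API:

    import os
    import shutil

    def remove_artifacts(staging_dir):
        # Best-effort cleanup: a missing MANIFEST simply means nothing
        # was staged for this job (the "__no_artifacts_staged__" case),
        # so return quietly instead of raising on a missing file.
        manifest = os.path.join(staging_dir, 'MANIFEST')
        if not os.path.exists(manifest):
            return
        shutil.rmtree(staging_dir, ignore_errors=True)
)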
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
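(Editor's note: the handler frame at the bottom of this traceback is the test harness's own watchdog, which is also what prints the "==================== Timed out ..." banners and thread dumps in the next failure. A minimal sketch of that pattern, assuming a POSIX SIGALRM-based timer; the 60-second constant matches the message above, while run_pipeline is an illustrative stand-in for run().wait_until_finish(), not Beam's actual code:

    import signal
    import time

    TIMEOUT_SECS = 60  # matches "Timed out after 60 seconds."

    def handler(signum, frame):
        # Raise BaseException rather than Exception so a broad
        # "except Exception" inside pipeline code cannot swallow it.
        raise BaseException('Timed out after %d seconds.' % TIMEOUT_SECS)

    def run_pipeline():
        time.sleep(1)  # stand-in for a blocking gRPC state-stream read

    signal.signal(signal.SIGALRM, handler)
    signal.alarm(TIMEOUT_SECS)  # arm the watchdog before the blocking call
    try:
        run_pipeline()
    finally:
        signal.alarm(0)  # disarm once the wait returns
)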

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140073030014720)>

# Thread: <Thread(Thread-118, started daemon 140073021622016)>

# Thread: <_MainThread(MainThread, started 140073809753856)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140072996443904)>

# Thread: <Thread(Thread-124, started daemon 140073004836608)>

# Thread: <_MainThread(MainThread, started 140073809753856)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575829312.25_948cde46-6c94-43be-a48e-681386c6a93d failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 297.515s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
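
(Editor's note: the output above already names the failing task. A quick way to reproduce it locally with the diagnostics Gradle suggests, assuming a Beam source checkout with the Gradle wrapper; the invocation below is illustrative, not taken from this log:

    ./gradlew :sdks:python:test-suites:portable:py2:sparkValidatesRunner --stacktrace --info
)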

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 28s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/nx25marmktgls

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1726

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1726/display/redirect>

Changes:


------------------------------------------
[...truncated 1.55 MB...]
19/12/08 12:14:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:38399
19/12/08 12:14:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 12:14:02 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 12:14:02 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575807239.54_9fec38bf-aa74-45f2-bc0b-703d73146d76', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 12:14:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575807239.54', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:34663', 'job_port': u'0'}
19/12/08 12:14:02 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 12:14:02 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:40339.
19/12/08 12:14:02 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 12:14:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/08 12:14:02 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 12:14:02 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:45805.
19/12/08 12:14:02 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 12:14:02 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:45337
19/12/08 12:14:02 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 12:14:02 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/08 12:14:02 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 12:14:02 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 12:14:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 12:14:02 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 12:14:02 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 12:14:02 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 12:14:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 12:14:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 12:14:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 12:14:02 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 12:14:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 12:14:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 12:14:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:40855
19/12/08 12:14:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 12:14:03 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 12:14:03 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575807239.54_9fec38bf-aa74-45f2-bc0b-703d73146d76', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 12:14:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575807239.54', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:34663', 'job_port': u'0'}
19/12/08 12:14:03 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 12:14:03 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:38645.
19/12/08 12:14:03 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 12:14:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/08 12:14:03 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 12:14:03 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:46093.
19/12/08 12:14:03 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 12:14:03 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:37971
19/12/08 12:14:03 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 12:14:03 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/08 12:14:03 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 12:14:03 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 12:14:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 12:14:03 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 12:14:03 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 12:14:03 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 12:14:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 12:14:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 12:14:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 12:14:03 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 12:14:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 12:14:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 12:14:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:38295
19/12/08 12:14:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 12:14:04 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 12:14:04 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575807239.54_9fec38bf-aa74-45f2-bc0b-703d73146d76', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 12:14:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575807239.54', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:34663', 'job_port': u'0'}
19/12/08 12:14:04 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 12:14:04 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:43681.
19/12/08 12:14:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/08 12:14:04 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 12:14:04 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 12:14:04 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:38131.
19/12/08 12:14:04 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 12:14:04 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:37679
19/12/08 12:14:04 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 12:14:04 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/08 12:14:04 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 12:14:04 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 12:14:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 12:14:04 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 12:14:04 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 12:14:04 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 12:14:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 12:14:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 12:14:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 12:14:04 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 12:14:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 12:14:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 12:14:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:45865
19/12/08 12:14:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 12:14:05 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 12:14:05 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575807239.54_9fec38bf-aa74-45f2-bc0b-703d73146d76', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 12:14:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575807239.54', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:34663', 'job_port': u'0'}
19/12/08 12:14:05 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 12:14:05 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:42457.
19/12/08 12:14:05 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/08 12:14:05 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 12:14:05 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 12:14:05 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:36157.
19/12/08 12:14:05 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 12:14:05 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:44057
19/12/08 12:14:05 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 12:14:05 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/08 12:14:05 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 12:14:05 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 12:14:05 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 12:14:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 12:14:05 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 12:14:05 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 12:14:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 12:14:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 12:14:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 12:14:05 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575807239.54_9fec38bf-aa74-45f2-bc0b-703d73146d76 finished.
19/12/08 12:14:05 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/08 12:14:05 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_0246cfcd-eb11-4757-9fba-cb3307f76a80","basePath":"/tmp/sparktest5ZyTYL"}: {}
java.io.FileNotFoundException: /tmp/sparktest5ZyTYL/job_0246cfcd-eb11-4757-9fba-cb3307f76a80/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140172986742528)>

# Thread: <Thread(Thread-118, started daemon 140172978349824)>

# Thread: <_MainThread(MainThread, started 140173766481664)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140172490434304)>

# Thread: <Thread(Thread-124, started daemon 140172482041600)>

# Thread: <Thread(Thread-118, started daemon 140172978349824)>

# Thread: <_MainThread(MainThread, started 140173766481664)>

# Thread: <Thread(wait_until_finish_read, started daemon 140172986742528)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575807231.22_bd303a65-7058-424f-8826-ee7f6036d19b failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 318.714s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 10s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/2sycxr64ilxo2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1725

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1725/display/redirect>

Changes:


------------------------------------------
[...truncated 1.55 MB...]
19/12/08 06:13:29 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:34663
19/12/08 06:13:29 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 06:13:29 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 06:13:29 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575785607.19_b22baeb9-aa6d-49de-90f7-74a21c6d001e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 06:13:29 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575785607.19', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50757', 'job_port': u'0'}
19/12/08 06:13:29 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 06:13:29 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:41547.
19/12/08 06:13:29 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 06:13:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/08 06:13:29 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 06:13:29 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:40901.
19/12/08 06:13:29 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 06:13:29 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:39839
19/12/08 06:13:29 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 06:13:29 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/08 06:13:29 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 06:13:29 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 06:13:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 06:13:29 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 06:13:29 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 06:13:29 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 06:13:29 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 06:13:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 06:13:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 06:13:30 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 06:13:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 06:13:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 06:13:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:34029
19/12/08 06:13:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 06:13:31 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 06:13:31 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575785607.19_b22baeb9-aa6d-49de-90f7-74a21c6d001e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 06:13:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575785607.19', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50757', 'job_port': u'0'}
19/12/08 06:13:31 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:43189.
19/12/08 06:13:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:43093.
19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 06:13:31 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:37945
19/12/08 06:13:31 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 06:13:31 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 06:13:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 06:13:31 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 06:13:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 06:13:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 06:13:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 06:13:31 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 06:13:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 06:13:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 06:13:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:36469
19/12/08 06:13:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 06:13:31 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 06:13:31 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575785607.19_b22baeb9-aa6d-49de-90f7-74a21c6d001e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 06:13:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575785607.19', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50757', 'job_port': u'0'}
19/12/08 06:13:31 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:43227.
19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 06:13:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:40105.
19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 06:13:31 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:40195
19/12/08 06:13:32 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 06:13:32 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/08 06:13:32 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 06:13:32 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 06:13:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 06:13:32 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 06:13:32 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 06:13:32 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 06:13:32 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 06:13:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 06:13:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 06:13:32 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 06:13:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 06:13:32 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 06:13:32 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:35251
19/12/08 06:13:32 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 06:13:32 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 06:13:32 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575785607.19_b22baeb9-aa6d-49de-90f7-74a21c6d001e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 06:13:32 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575785607.19', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50757', 'job_port': u'0'}
19/12/08 06:13:32 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 06:13:32 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:34203.
19/12/08 06:13:32 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 06:13:32 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 06:13:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/08 06:13:32 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:44549.
19/12/08 06:13:32 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 06:13:32 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:45315
19/12/08 06:13:32 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 06:13:32 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/08 06:13:32 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 06:13:32 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 06:13:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 06:13:32 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 06:13:32 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 06:13:32 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 06:13:32 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 06:13:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 06:13:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 06:13:33 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575785607.19_b22baeb9-aa6d-49de-90f7-74a21c6d001e finished.
19/12/08 06:13:33 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/08 06:13:33 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_618d0c82-712b-49f7-9f11-3830839d584f","basePath":"/tmp/sparktestB1VYcM"}: {}
java.io.FileNotFoundException: /tmp/sparktestB1VYcM/job_618d0c82-712b-49f7-9f11-3830839d584f/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
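
This cleanup failure is benign: on job completion the job service tries to load the staging MANIFEST so it can delete staged artifacts, but this run staged none (note the repeated "GetManifest for __no_artifacts_staged__" lines above), so the file does not exist. A minimal sketch of a defensive cleanup under that assumption (hypothetical Python helper, not Beam's Java implementation):

import os
import shutil

def remove_staged_artifacts(base_path, session_id):
    job_dir = os.path.join(base_path, session_id)
    manifest = os.path.join(job_dir, 'MANIFEST')
    if not os.path.exists(manifest):
        # Mirrors the __no_artifacts_staged__ case: nothing was staged,
        # so there is nothing to remove and no reason to raise.
        return
    shutil.rmtree(job_dir)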
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
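
The BaseException above comes from a watchdog the test harness installs (the "handler" frame in portable_runner_test.py): when the pipeline's state stream stalls, the handler fires and aborts the wait. A minimal sketch of such a watchdog, assuming a SIGALRM-based implementation (the traceback, which shows the handler interrupting threading's _sleep on the main thread, is consistent with this):

import signal

def install_timeout(seconds=60):
    # When the alarm fires, the handler raises out of whatever frame the
    # main thread is in -- here, grpc's polling wait loop.
    def handler(signum, frame):
        raise BaseException('Timed out after %d seconds.' % seconds)
    signal.signal(signal.SIGALRM, handler)
    signal.alarm(seconds)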

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140191461324544)>

# Thread: <Thread(Thread-119, started daemon 140191469717248)>

# Thread: <_MainThread(MainThread, started 140192257849088)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140191443752704)>
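
The "# Thread:" lines are the test harness dumping every live thread when its watchdog fires, which is why they interleave with traceback text in the raw log. A minimal sketch of how such a dump is produced (illustrative helper, not the harness's exact code):

import threading

def dump_threads():
    # One line per live thread, in the same shape as the "# Thread: <...>"
    # lines above: repr() of a Thread includes its name and daemon status.
    for t in threading.enumerate():
        print('# Thread: %r' % t)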
======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575785598.81_35b40348-39b8-45d6-82a8-555efa56af67 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

# Thread: <Thread(Thread-125, started daemon 140191452145408)>

# Thread: <Thread(Thread-119, started daemon 140191469717248)>

# Thread: <_MainThread(MainThread, started 140192257849088)>

# Thread: <Thread(wait_until_finish_read, started daemon 140191461324544)>
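
test_sdf_with_watermark_tracking exercises a splittable DoFn whose processing may checkpoint mid-bundle, handing the runner a residual to re-schedule. The UnsupportedOperationException indicates the Spark portable runner registers no handler for that residual, so any bundle that actually checkpoints can only fail. A conceptual Python sketch of the missing hook (names hypothetical, not Beam's Java API):

class ActiveBundle(object):
    def __init__(self, checkpoint_handler=None):
        self._checkpoint_handler = checkpoint_handler

    def on_checkpoint(self, residual):
        # With no registered handler there is nowhere to send the residual
        # work, so the bundle fails -- matching the error message above.
        if self._checkpoint_handler is None:
            raise NotImplementedError(
                'The ActiveBundle does not have a registered bundle '
                'checkpoint handler.')
        self._checkpoint_handler(residual)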

----------------------------------------------------------------------
Ran 38 tests in 316.984s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 9s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/yrykbxlrcxaj2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1724

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1724/display/redirect>

Changes:


------------------------------------------
[...truncated 1.54 MB...]
19/12/08 00:12:29 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1575763948.47_a44f523c-f713-44d0-9242-c7c1db0dd3a8 on Spark master local
19/12/08 00:12:29 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
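
The coder warning above concerns a correctness property: GroupNonMergingWindowsFunctions groups keys by their encoded bytes, which is only safe when byte-equality of encodings agrees with value equality ("consistent with equals"). A tiny illustrative check of that property (hypothetical helper, not Beam code):

import struct

def consistent_with_equals(encode, a, b):
    # True when encoded-byte equality agrees with value equality for a, b.
    return (encode(a) == encode(b)) == (a == b)

# A length-prefixed byte-string coder, for example, has the property:
encode = lambda bs: struct.pack('>I', len(bs)) + bs
assert consistent_with_equals(encode, b'ab', b'ab')
assert consistent_with_equals(encode, b'ab', b'ac')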
19/12/08 00:12:29 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575763948.47_a44f523c-f713-44d0-9242-c7c1db0dd3a8: Pipeline translated successfully. Computing outputs
19/12/08 00:12:29 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 00:12:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 00:12:30 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 00:12:30 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:43591
19/12/08 00:12:30 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 00:12:30 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 00:12:30 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575763948.47_a44f523c-f713-44d0-9242-c7c1db0dd3a8', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 00:12:30 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575763948.47', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46801', 'job_port': u'0'}
19/12/08 00:12:30 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:42985.
19/12/08 00:12:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 258-1
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:38665.
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 00:12:30 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:35009
19/12/08 00:12:30 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 00:12:30 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 00:12:30 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 00:12:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 00:12:30 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 00:12:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 00:12:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 00:12:30 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 00:12:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 00:12:30 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 00:12:30 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:33805
19/12/08 00:12:30 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 00:12:30 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 00:12:30 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575763948.47_a44f523c-f713-44d0-9242-c7c1db0dd3a8', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 00:12:30 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575763948.47', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46801', 'job_port': u'0'}
19/12/08 00:12:30 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:37689.
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 00:12:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:43347.
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 00:12:30 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:34991
19/12/08 00:12:30 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 00:12:30 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 00:12:30 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 00:12:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 00:12:30 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 00:12:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 00:12:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 00:12:31 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 00:12:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 00:12:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 00:12:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:45365
19/12/08 00:12:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 00:12:31 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 00:12:31 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575763948.47_a44f523c-f713-44d0-9242-c7c1db0dd3a8', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 00:12:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575763948.47', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46801', 'job_port': u'0'}
19/12/08 00:12:31 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 00:12:31 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:33273.
19/12/08 00:12:31 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 00:12:31 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 00:12:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/08 00:12:31 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:33839.
19/12/08 00:12:31 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 00:12:31 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:43949
19/12/08 00:12:31 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 00:12:31 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/08 00:12:31 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 00:12:31 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 00:12:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 00:12:31 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 00:12:31 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 00:12:31 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 00:12:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 00:12:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 00:12:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 00:12:31 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 00:12:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 00:12:32 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 00:12:32 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:39927
19/12/08 00:12:32 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 00:12:32 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 00:12:32 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575763948.47_a44f523c-f713-44d0-9242-c7c1db0dd3a8', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 00:12:32 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575763948.47', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46801', 'job_port': u'0'}
19/12/08 00:12:32 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 00:12:32 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:40237.
19/12/08 00:12:32 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 00:12:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/08 00:12:32 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 00:12:32 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:44083.
19/12/08 00:12:32 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 00:12:32 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:33497
19/12/08 00:12:32 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 00:12:32 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/08 00:12:32 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 00:12:32 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 00:12:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 00:12:32 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 00:12:32 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 00:12:32 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 00:12:32 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 00:12:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 00:12:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 00:12:32 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 00:12:33 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 00:12:33 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 00:12:33 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:44863
19/12/08 00:12:33 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 00:12:33 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 00:12:33 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575763948.47_a44f523c-f713-44d0-9242-c7c1db0dd3a8', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 00:12:33 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575763948.47', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46801', 'job_port': u'0'}
19/12/08 00:12:33 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 00:12:33 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:39803.
19/12/08 00:12:33 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 00:12:33 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/08 00:12:33 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 00:12:33 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:41221.
19/12/08 00:12:33 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 00:12:33 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:40153
19/12/08 00:12:33 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 00:12:33 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/08 00:12:33 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 00:12:33 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 00:12:33 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 00:12:33 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 00:12:33 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 00:12:33 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 00:12:33 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 00:12:33 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 00:12:33 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 00:12:33 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575763948.47_a44f523c-f713-44d0-9242-c7c1db0dd3a8 finished.
19/12/08 00:12:33 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/08 00:12:33 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_13eaa89c-93bd-43c0-a91d-be5dff023556","basePath":"/tmp/sparktestnQuZPt"}: {}
java.io.FileNotFoundException: /tmp/sparktestnQuZPt/job_13eaa89c-93bd-43c0-a91d-be5dff023556/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
==================== Timed out after 60 seconds. ====================
    _sleep(delay)

  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
# Thread: <Thread(wait_until_finish_read, started daemon 140248801605376)>

# Thread: <Thread(Thread-119, started daemon 140249285326592)>

    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
# Thread: <_MainThread(MainThread, started 140250073458432)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575763940.79_a0303fdb-81c3-4f00-ab37-f25febb2ee7e failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 278.358s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 3s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/e34banjsdqosa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1723

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1723/display/redirect>

Changes:


------------------------------------------
[...truncated 1.55 MB...]
19/12/07 18:13:35 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:35595
19/12/07 18:13:35 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/07 18:13:35 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/07 18:13:35 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575742412.7_fc0501f1-bbf7-40e4-b4cd-88244992b9ac', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/07 18:13:35 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575742412.7', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52473', 'job_port': u'0'}
19/12/07 18:13:35 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/07 18:13:35 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:35353.
19/12/07 18:13:35 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/07 18:13:35 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/07 18:13:35 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/07 18:13:35 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:35015.
19/12/07 18:13:35 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/07 18:13:35 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:44811
19/12/07 18:13:35 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/07 18:13:35 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/07 18:13:35 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/07 18:13:35 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/07 18:13:35 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/07 18:13:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 18:13:35 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/07 18:13:35 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/07 18:13:35 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/07 18:13:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/07 18:13:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 18:13:35 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/07 18:13:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/07 18:13:36 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/07 18:13:36 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:46307
19/12/07 18:13:36 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/07 18:13:36 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/07 18:13:36 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575742412.7_fc0501f1-bbf7-40e4-b4cd-88244992b9ac', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/07 18:13:36 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575742412.7', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52473', 'job_port': u'0'}
19/12/07 18:13:36 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/07 18:13:36 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:43265.
19/12/07 18:13:36 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/07 18:13:36 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/07 18:13:36 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/07 18:13:36 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:34127.
19/12/07 18:13:36 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/07 18:13:36 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:38417
19/12/07 18:13:36 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/07 18:13:36 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/07 18:13:36 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/07 18:13:36 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/07 18:13:36 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/07 18:13:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 18:13:36 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/07 18:13:36 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/07 18:13:36 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/07 18:13:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/07 18:13:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 18:13:36 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/07 18:13:37 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/07 18:13:37 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/07 18:13:37 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:44457
19/12/07 18:13:37 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/07 18:13:37 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/07 18:13:37 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575742412.7_fc0501f1-bbf7-40e4-b4cd-88244992b9ac', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/07 18:13:37 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575742412.7', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52473', 'job_port': u'0'}
19/12/07 18:13:37 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/07 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:36425.
19/12/07 18:13:37 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/07 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/07 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/07 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:38985.
19/12/07 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/07 18:13:37 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:41099
19/12/07 18:13:37 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/07 18:13:37 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/07 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/07 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/07 18:13:37 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 18:13:37 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/07 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/07 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/07 18:13:37 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/07 18:13:37 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/07 18:13:37 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 18:13:37 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/07 18:13:38 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/07 18:13:38 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/07 18:13:38 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:37845
19/12/07 18:13:38 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/07 18:13:38 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/07 18:13:38 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575742412.7_fc0501f1-bbf7-40e4-b4cd-88244992b9ac', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/07 18:13:38 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575742412.7', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52473', 'job_port': u'0'}
19/12/07 18:13:38 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/07 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:44141.
19/12/07 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/07 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/07 18:13:38 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/07 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:42213.
19/12/07 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/07 18:13:38 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:46073
19/12/07 18:13:38 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/07 18:13:38 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/07 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/07 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/07 18:13:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 18:13:38 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/07 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/07 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/07 18:13:38 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/07 18:13:38 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/07 18:13:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 18:13:38 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575742412.7_fc0501f1-bbf7-40e4-b4cd-88244992b9ac finished.
19/12/07 18:13:38 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/07 18:13:38 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_a00a2054-dd34-473a-ab28-439051d421b3","basePath":"/tmp/sparktest0dw0n8"}: {}
java.io.FileNotFoundException: /tmp/sparktest0dw0n8/job_a00a2054-dd34-473a-ab28-439051d421b3/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
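The WARN above is noise rather than a failure: by the time the job service cleans its staging directory, the MANIFEST it expects is absent, apparently because this run staged no artifacts (as the GetManifest for __no_artifacts_staged__ lines suggest). The real code path is Java (BeamFileSystemArtifactStagingService.removeArtifacts); what follows is only an illustrative Python sketch of the same best-effort idea, tolerating a missing manifest instead of logging an exception:

    import os
    import shutil

    def remove_staging_dir(base_path, session_id):
        """Best-effort staging cleanup; a missing MANIFEST just means
        nothing was staged, so skip instead of raising."""
        job_dir = os.path.join(base_path, session_id)
        if not os.path.exists(os.path.join(job_dir, 'MANIFEST')):
            return  # no artifacts were staged for this job
        shutil.rmtree(job_dir, ignore_errors=True)

    # e.g. remove_staging_dir('/tmp/sparktest0dw0n8',
    #                         'job_a00a2054-dd34-473a-ab28-439051d421b3')
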
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
==================== Timed out after 60 seconds. ====================
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next

    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
# Thread: <Thread(wait_until_finish_read, started daemon 139860845262592)>

# Thread: <Thread(Thread-119, started daemon 139860828477184)>

# Thread: <_MainThread(MainThread, started 139861963613952)>
==================== Timed out after 60 seconds. ====================

    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

# Thread: <Thread(wait_until_finish_read, started daemon 139860811691776)>

# Thread: <Thread(Thread-125, started daemon 139860820084480)>

# Thread: <_MainThread(MainThread, started 139861963613952)>

# Thread: <Thread(Thread-119, started daemon 139860828477184)>

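The interleaving in this error block is two writers racing on stdout: the gRPC wait loop's traceback and a watchdog that fires after 60 seconds, dumps every live thread (the "# Thread:" lines), and aborts the hung test by raising BaseException from a signal handler (portable_runner_test.py line 75 in the traceback). A minimal sketch of such a handler, assuming a SIGALRM-based watchdog; the exact helper in the test suite may differ:

    import signal
    import threading

    TIMEOUT_SECS = 60  # matches "Timed out after 60 seconds" above

    def handler(signum, frame):
        # Dump every live thread first, which is why "# Thread: <...>"
        # lines end up interleaved with the failing test's traceback.
        for t in threading.enumerate():
            print('# Thread: %r' % t)
        raise BaseException('Timed out after %d seconds.' % TIMEOUT_SECS)

    signal.signal(signal.SIGALRM, handler)
    signal.alarm(TIMEOUT_SECS)
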
======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
# Thread: <Thread(wait_until_finish_read, started daemon 139860845262592)>
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

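Note the shape of the hang in both timeouts: grpc's _common.wait does not block once on the condition variable but polls it in short slices (_wait_once with MAXIMUM_WAIT_TIMEOUT), which is what lets the SIGALRM watchdog interrupt an otherwise indefinite wait. A simplified sketch of that pattern; the names mirror the traceback, not grpc's actual source:

    MAXIMUM_WAIT_TIMEOUT = 0.1  # short slices so signal handlers can run

    def wait(wait_fn, response_ready):
        # Poll instead of blocking forever: each call returns after at
        # most MAXIMUM_WAIT_TIMEOUT, giving the watchdog a chance to
        # raise out of the loop.
        while not response_ready():
            wait_fn(timeout=MAXIMUM_WAIT_TIMEOUT)
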
======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575742403.89_e054d60e-1537-4859-beca-9b0524632a3d failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

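Unlike the two timeouts, test_sdf_with_watermark_tracking fails fast: the Spark portable runner has no bundle checkpoint handler registered, so the first splittable-DoFn checkpoint request raises. The failing code is Java inside the runner; below is only an illustrative Python sketch of the registry pattern behind that message, not Beam's ActiveBundle:

    class ActiveBundleSketch(object):
        """Illustrative stand-in for a runner's active bundle."""

        def __init__(self, checkpoint_handler=None):
            # The Spark portable runner effectively runs with None here.
            self._checkpoint_handler = checkpoint_handler

        def on_checkpoint(self, response):
            if self._checkpoint_handler is None:
                raise NotImplementedError(
                    'The ActiveBundle does not have a registered '
                    'bundle checkpoint handler.')
            self._checkpoint_handler(response)
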
----------------------------------------------------------------------
Ran 38 tests in 317.608s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
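
To reproduce with more diagnostics than this summary, the failing task can be re-run from the repository root with the flags Gradle itself suggests (assuming the standard ./gradlew wrapper; the task path is taken from the failure message above):

    ./gradlew :sdks:python:test-suites:portable:py2:sparkValidatesRunner --stacktrace --info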

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 0s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/kq6cuqisvewgu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1722

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1722/display/redirect>

Changes:


------------------------------------------
[...truncated 1.55 MB...]
19/12/07 12:13:50 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:39001
19/12/07 12:13:50 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/07 12:13:50 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/07 12:13:50 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575720827.74_7af050f8-6e26-4784-a521-52bfbbe456a6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/07 12:13:50 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575720827.74', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49403', 'job_port': u'0'}
19/12/07 12:13:50 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/07 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:36209.
19/12/07 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/07 12:13:50 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/07 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/07 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:42035.
19/12/07 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/07 12:13:50 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:38441
19/12/07 12:13:50 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/07 12:13:50 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/07 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/07 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/07 12:13:50 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 12:13:50 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/07 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/07 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/07 12:13:50 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/07 12:13:50 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/07 12:13:50 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 12:13:50 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/07 12:13:51 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/07 12:13:51 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/07 12:13:51 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:42207
19/12/07 12:13:51 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/07 12:13:51 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/07 12:13:51 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575720827.74_7af050f8-6e26-4784-a521-52bfbbe456a6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/07 12:13:51 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575720827.74', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49403', 'job_port': u'0'}
19/12/07 12:13:51 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/07 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:38437.
19/12/07 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/07 12:13:51 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/07 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/07 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:37521.
19/12/07 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/07 12:13:51 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:42193
19/12/07 12:13:51 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/07 12:13:51 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/07 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/07 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/07 12:13:51 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/07 12:13:51 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/07 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/07 12:13:51 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/07 12:13:51 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/07 12:13:51 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 12:13:51 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/07 12:13:52 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/07 12:13:52 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/07 12:13:52 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:33479
19/12/07 12:13:52 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/07 12:13:52 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/07 12:13:52 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575720827.74_7af050f8-6e26-4784-a521-52bfbbe456a6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/07 12:13:52 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575720827.74', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49403', 'job_port': u'0'}
19/12/07 12:13:52 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/07 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:46247.
19/12/07 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/07 12:13:52 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/07 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/07 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:38005.
19/12/07 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/07 12:13:52 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:41935
19/12/07 12:13:52 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/07 12:13:52 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/07 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/07 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/07 12:13:52 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 12:13:52 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/07 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/07 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/07 12:13:52 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/07 12:13:52 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/07 12:13:52 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 12:13:52 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/07 12:13:53 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/07 12:13:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/07 12:13:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:38287
19/12/07 12:13:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/07 12:13:53 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/07 12:13:53 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575720827.74_7af050f8-6e26-4784-a521-52bfbbe456a6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/07 12:13:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575720827.74', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49403', 'job_port': u'0'}
19/12/07 12:13:53 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/07 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:37181.
19/12/07 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/07 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/07 12:13:53 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/07 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:42243.
19/12/07 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/07 12:13:53 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:43893
19/12/07 12:13:53 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/07 12:13:53 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/07 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/07 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/07 12:13:53 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/07 12:13:53 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/07 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/07 12:13:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/07 12:13:53 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/07 12:13:53 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 12:13:53 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575720827.74_7af050f8-6e26-4784-a521-52bfbbe456a6 finished.
19/12/07 12:13:53 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/07 12:13:53 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_7cf21e16-f232-4285-88b1-59a02824a514","basePath":"/tmp/sparktestgrspuj"}: {}
java.io.FileNotFoundException: /tmp/sparktestgrspuj/job_7cf21e16-f232-4285-88b1-59a02824a514/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139823296952064)>

    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
# Thread: <Thread(Thread-119, started daemon 139823288559360)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
# Thread: <_MainThread(MainThread, started 139824085083904)>
==================== Timed out after 60 seconds. ====================

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
# Thread: <Thread(wait_until_finish_read, started daemon 139823279904512)>

  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

# Thread: <Thread(Thread-125, started daemon 139823197189888)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
# Thread: <_MainThread(MainThread, started 139824085083904)>

  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(lis# Thread: <Thread(Thread-119, started daemon 139823288559360)>

t(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
# Thread: <Thread(wait_until_finish_read, started daemon 139823296952064)>
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575720819.74_0b1896e5-89ab-4af0-9d70-c9e6067b7531 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 316.753s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 16s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/yj2lnh5iyppyi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1721

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1721/display/redirect>

Changes:


------------------------------------------
[...truncated 1.55 MB...]
19/12/07 06:13:09 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:46745
19/12/07 06:13:09 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/07 06:13:09 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/07 06:13:09 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575699186.54_9d7f8f37-7ef8-4d49-b0e8-dd9c48af57d3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/07 06:13:09 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575699186.54', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:59867', 'job_port': u'0'}
19/12/07 06:13:09 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/07 06:13:09 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:33253.
19/12/07 06:13:09 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/07 06:13:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/07 06:13:09 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/07 06:13:09 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:38019.
19/12/07 06:13:09 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/07 06:13:09 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:34831
19/12/07 06:13:09 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/07 06:13:09 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/07 06:13:09 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/07 06:13:09 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/07 06:13:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 06:13:09 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/07 06:13:09 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/07 06:13:09 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/07 06:13:09 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/07 06:13:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/07 06:13:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 06:13:09 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/07 06:13:10 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/07 06:13:10 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/07 06:13:10 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:38141
19/12/07 06:13:10 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/07 06:13:10 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/07 06:13:10 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575699186.54_9d7f8f37-7ef8-4d49-b0e8-dd9c48af57d3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/07 06:13:10 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575699186.54', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:59867', 'job_port': u'0'}
19/12/07 06:13:10 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/07 06:13:10 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:42267.
19/12/07 06:13:10 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/07 06:13:10 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/07 06:13:10 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/07 06:13:10 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:33823.
19/12/07 06:13:10 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/07 06:13:10 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:34989
19/12/07 06:13:10 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/07 06:13:10 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/07 06:13:10 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/07 06:13:10 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/07 06:13:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 06:13:10 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/07 06:13:10 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/07 06:13:10 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/07 06:13:10 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/07 06:13:10 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/07 06:13:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 06:13:10 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
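
The token __no_artifacts_staged__ is a sentinel retrieval token: these tests stage no artifacts, so manifest retrieval can answer without touching the filesystem. A hedged sketch of that special-casing (the token string is real per the log line; the function and its callback are assumptions for illustration):

    NO_ARTIFACTS_STAGED_TOKEN = '__no_artifacts_staged__'

    def get_manifest(retrieval_token, load_from_filesystem):
        if retrieval_token == NO_ARTIFACTS_STAGED_TOKEN:
            return []  # empty manifest: there is nothing to download
        return load_from_filesystem(retrieval_token)
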
19/12/07 06:13:11 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/07 06:13:11 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/07 06:13:11 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:44487
19/12/07 06:13:11 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/07 06:13:11 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/07 06:13:11 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575699186.54_9d7f8f37-7ef8-4d49-b0e8-dd9c48af57d3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/07 06:13:11 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575699186.54', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:59867', 'job_port': u'0'}
19/12/07 06:13:11 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/07 06:13:11 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:40479.
19/12/07 06:13:11 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/07 06:13:11 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/07 06:13:11 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/07 06:13:11 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:42403.
19/12/07 06:13:11 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/07 06:13:11 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:33487
19/12/07 06:13:11 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/07 06:13:11 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/07 06:13:11 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/07 06:13:11 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/07 06:13:11 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 06:13:11 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/07 06:13:11 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/07 06:13:11 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/07 06:13:11 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/07 06:13:11 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/07 06:13:11 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 06:13:11 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/07 06:13:12 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/07 06:13:12 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/07 06:13:12 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:39369
19/12/07 06:13:12 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/07 06:13:12 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/07 06:13:12 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575699186.54_9d7f8f37-7ef8-4d49-b0e8-dd9c48af57d3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/07 06:13:12 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575699186.54', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:59867', 'job_port': u'0'}
19/12/07 06:13:12 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/07 06:13:12 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:35437.
19/12/07 06:13:12 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/07 06:13:12 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/07 06:13:12 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/07 06:13:12 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:42401.
19/12/07 06:13:12 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/07 06:13:12 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:42495
19/12/07 06:13:12 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/07 06:13:12 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/07 06:13:12 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/07 06:13:12 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/07 06:13:12 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/07 06:13:12 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 06:13:12 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/07 06:13:12 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/07 06:13:12 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/07 06:13:12 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/07 06:13:12 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 06:13:12 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575699186.54_9d7f8f37-7ef8-4d49-b0e8-dd9c48af57d3 finished.
19/12/07 06:13:12 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/07 06:13:12 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_91f55744-e8ab-477e-b75e-878b3f3179bf","basePath":"/tmp/sparktestRu7hjr"}: {}
java.io.FileNotFoundException: /tmp/sparktestRu7hjr/job_91f55744-e8ab-477e-b75e-878b3f3179bf/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
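
This warning is benign: when the job reaches a terminal state, cleanup tries to read the staging MANIFEST to learn which artifacts to delete, but since nothing was staged the file never existed and the open fails with FileNotFoundException. A defensive variant would treat a missing manifest as "nothing staged" instead of logging a stack trace; a minimal sketch in Python (a hypothetical helper, not Beam's removeArtifacts, and the manifest keys are assumptions):

    import errno
    import json
    import os
    import shutil

    def remove_staged_artifacts(staging_dir):
        manifest_path = os.path.join(staging_dir, 'MANIFEST')
        try:
            with open(manifest_path) as f:
                manifest = json.load(f)
        except IOError as e:
            if e.errno == errno.ENOENT:
                return  # no artifacts were staged for this job; nothing to remove
            raise
        for artifact in manifest.get('artifact', []):
            os.remove(os.path.join(staging_dir, artifact['name']))
        shutil.rmtree(staging_dir, ignore_errors=True)
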
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
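
Both artifacts of the timeout are visible in this run: the traceback ends in a test-owned handler that raises BaseException, and the output below shows a watchdog banner followed by "# Thread:" lines enumerating the live threads. A minimal sketch of that watchdog pattern, assuming a SIGALRM-based implementation (the real portable_runner_test.py handler may differ in detail):

    import signal
    import sys
    import threading

    TIMEOUT_SECS = 60

    def _dump_threads(out=sys.stderr):
        # Reproduces the banner and "# Thread: <...>" lines seen in this log.
        out.write('==================== Timed out after %d seconds. '
                  '====================\n\n' % TIMEOUT_SECS)
        for thread in threading.enumerate():
            out.write('# Thread: %r\n\n' % thread)

    def install_timeout(seconds=TIMEOUT_SECS):
        msg = 'Timed out after %d seconds.' % seconds

        def handler(signum, frame):
            _dump_threads()
            # BaseException (not Exception) so broad `except Exception`
            # blocks in the code under test cannot swallow the timeout.
            raise BaseException(msg)

        signal.signal(signal.SIGALRM, handler)
        signal.alarm(seconds)

    # Usage: install_timeout() before running the pipeline; call
    # signal.alarm(0) afterwards to cancel the watchdog on success.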

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_ru==================== Timed out after 60 seconds. ====================
nner.py", line 428, in wait_until_finish

# Thread: <Thread(wait_until_finish_read, started daemon 140490094511872)>

    for state_response in self._state_stream:
# Thread: <Thread(Thread-120, started daemon 140490077726464)>

# Thread: <_MainThread(MainThread, started 140490874251008)>
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140489453991680)>

# Thread: <Thread(Thread-126, started daemon 140490068547328)>
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
# Thread: <Thread(Thread-120, started daemon 140490077726464)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
# Thread: <Thread(wait_until_finish_read, started daemon 140490094511872)>

# Thread: <_MainThread(MainThread, started 140490874251008)>
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575699178.11_3d5af02c-f5b7-4581-91c6-f92e3dcdbf7b failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
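
Unlike the two timeouts above, this failure is deterministic: the splittable DoFn under test asks the runner to checkpoint the remainder of its restriction, and the portable Spark runner's ActiveBundle has no bundle checkpoint handler registered, so it throws UnsupportedOperationException. For orientation, a sketch of the kind of SDF such a test exercises, written against Beam's public Python SDF surface (the class names here are illustrative, not the test's actual code):

    import apache_beam as beam
    from apache_beam.io.restriction_trackers import OffsetRange
    from apache_beam.io.restriction_trackers import OffsetRestrictionTracker

    class EmitCharsProvider(beam.transforms.core.RestrictionProvider):
        def initial_restriction(self, element):
            return OffsetRange(0, len(element))

        def create_tracker(self, restriction):
            return OffsetRestrictionTracker(restriction)

        def restriction_size(self, element, restriction):
            return restriction.size()

    class EmitChars(beam.DoFn):
        def process(self, element,
                    tracker=beam.DoFn.RestrictionParam(EmitCharsProvider())):
            cur = tracker.current_restriction().start
            # try_claim can return False when the runner splits or checkpoints
            # the restriction mid-bundle; resuming that remainder is exactly
            # what requires a registered checkpoint handler on the runner side.
            while tracker.try_claim(cur):
                yield element[cur]
                cur += 1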

----------------------------------------------------------------------
Ran 38 tests in 306.941s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 51s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/bsxs7pcdbeq2q

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1720

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1720/display/redirect>

Changes:


------------------------------------------
[...truncated 1.55 MB...]

19/12/07 00:53:28 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/07 00:53:28 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/07 00:53:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 00:53:28 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/07 00:53:28 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/07 00:53:28 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/07 00:53:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/07 00:53:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/07 00:53:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 00:53:28 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575680002.3_4f7f617c-b704-4dcd-8112-b156f7fbdd45 finished.
19/12/07 00:53:28 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/07 00:53:28 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_9ed91e65-dba7-4e50-a2b7-81b49249abce","basePath":"/tmp/sparktestp2mwwi"}: {}
java.io.FileNotFoundException: /tmp/sparktestp2mwwi/job_9ed91e65-dba7-4e50-a2b7-81b49249abce/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140112964871936)>
# Thread: <Thread(Thread-118, started daemon 140112956479232)>
# Thread: <_MainThread(MainThread, started 140113949738752)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140112939693824)>
# Thread: <Thread(Thread-124, started daemon 140112931301120)>
# Thread: <_MainThread(MainThread, started 140113949738752)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(Thread-118, started daemon 140112956479232)>
# Thread: <Thread(wait_until_finish_read, started daemon 140112964871936)>
# Thread: <_MainThread(MainThread, started 140113949738752)>

======================================================================
ERROR: test_pardo_unfusable_side_inputs (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 254, in test_pardo_unfusable_side_inputs
    equal_to([('a', 'a'), ('a', 'b'), ('b', 'a'), ('b', 'b')]))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/pipeline.py", line 412, in run
    if test_runner_api and self._verify_runner_api_compatible():
  File "apache_beam/pipeline.py", line 625, in _verify_runner_api_compatible
    self.visit(Visitor())
  File "apache_beam/pipeline.py", line 457, in visit
    self._root_transform().visit(visitor, self, visited)
  File "apache_beam/pipeline.py", line 850, in visit
    part.visit(visitor, pipeline, visited)
  File "apache_beam/pipeline.py", line 850, in visit
    part.visit(visitor, pipeline, visited)
  File "apache_beam/pipeline.py", line 850, in visit
    part.visit(visitor, pipeline, visited)
  File "apache_beam/pipeline.py", line 853, in visit
    visitor.visit_transform(self)
  File "apache_beam/pipeline.py", line 616, in visit_transform
    enable_trace=False),
  File "apache_beam/internal/pickler.py", line 250, in dumps
    s = dill.dumps(o)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/dill/_dill.py",> line 265, in dumps
    dump(obj, file, protocol, byref, fmode, recurse, **kwds)#, strictio)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/dill/_dill.py",> line 259, in dump
    Pickler(file, protocol, **_kwds).dump(obj)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/dill/_dill.py",> line 445, in dump
    StockPickler.dump(self, obj)
  File "/usr/lib/python2.7/pickle.py", line 224, in dump
    self.save(obj)
  File "/usr/lib/python2.7/pickle.py", line 331, in save
    self.save_reduce(obj=obj, *rv)
  File "/usr/lib/python2.7/pickle.py", line 425, in save_reduce
    save(state)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "apache_beam/internal/pickler.py", line 215, in new_save_module_dict
    return old_save_module_dict(pickler, obj)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/dill/_dill.py",> line 912, in save_module_dict
    StockPickler.save_dict(pickler, obj)
  File "/usr/lib/python2.7/pickle.py", line 655, in save_dict
    self._batch_setitems(obj.iteritems())
  File "/usr/lib/python2.7/pickle.py", line 687, in _batch_setitems
    save(v)
  File "/usr/lib/python2.7/pickle.py", line 331, in save
    self.save_reduce(obj=obj, *rv)
  File "/usr/lib/python2.7/pickle.py", line 425, in save_reduce
    save(state)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "apache_beam/internal/pickler.py", line 215, in new_save_module_dict
    return old_save_module_dict(pickler, obj)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/dill/_dill.py",> line 912, in save_module_dict
    StockPickler.save_dict(pickler, obj)
  File "/usr/lib/python2.7/pickle.py", line 655, in save_dict
    self._batch_setitems(obj.iteritems())
  File "/usr/lib/python2.7/pickle.py", line 687, in _batch_setitems
    save(v)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/dill/_dill.py",> line 1421, in save_function
    obj.__dict__), obj=obj)
  File "/usr/lib/python2.7/pickle.py", line 401, in save_reduce
    save(args)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/usr/lib/python2.7/pickle.py", line 568, in save_tuple
    save(element)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "apache_beam/internal/pickler.py", line 215, in new_save_module_dict
    return old_save_module_dict(pickler, obj)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/dill/_dill.py",> line 912, in save_module_dict
    StockPickler.save_dict(pickler, obj)
  File "/usr/lib/python2.7/pickle.py", line 655, in save_dict
    self._batch_setitems(obj.iteritems())
  File "/usr/lib/python2.7/pickle.py", line 692, in _batch_setitems
    save(v)
  File "/usr/lib/python2.7/pickle.py", line 331, in save
    self.save_reduce(obj=obj, *rv)
  File "/usr/lib/python2.7/pickle.py", line 425, in save_reduce
    save(state)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/usr/lib/python2.7/pickle.py", line 554, in save_tuple
    save(element)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "apache_beam/internal/pickler.py", line 215, in new_save_module_dict
    return old_save_module_dict(pickler, obj)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/dill/_dill.py",> line 908, in save_module_dict
    log.info("D2: <dict%s" % str(obj.__repr__).split('dict')[-1]) # obj
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
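
This one is the same 60-second watchdog, but it fired while the pipeline was still being constructed: the frames show Beam pickling a transform with dill, recursing through module __dict__s (save_module_dict) as it serializes the functions' global namespaces. A minimal illustration of what that serialization does, using dill directly (Beam wraps this in apache_beam/internal/pickler.py):

    import dill

    GREETING = 'hello'

    def greet(name):
        # References a module-level global, which is why pickling walks the
        # surrounding module __dict__ as in the traceback above.
        return '%s, %s' % (GREETING, name)

    payload = dill.dumps(greet)   # serializes the function and what it references
    restored = dill.loads(payload)
    assert restored('beam') == 'hello, beam'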

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575679991.35_1d0d3f27-1b3d-444c-a8dd-a4bce957d299 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 373.771s

FAILED (errors=4, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 16s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/cwt57637tzcys

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1719

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1719/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-8835] Stage artifacts to BEAM-PIPELINE dir in zip

[kcweaver] [BEAM-8835] Check for leading slash in zip file paths.
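
The two BEAM-8835 entries concern artifact staging into a zip: entries are written under a BEAM-PIPELINE directory inside the archive, and entry names with a leading slash would resolve outside the intended directory when read back. A hedged sketch of that kind of leading-slash check, with details that are assumptions rather than the actual patch:

    import zipfile

    def validated_entry_names(zip_path):
        with zipfile.ZipFile(zip_path) as zf:
            for name in zf.namelist():
                if name.startswith('/'):
                    raise ValueError('absolute path in zip entry: %r' % name)
                yield name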


------------------------------------------
[...truncated 1.55 MB...]
19/12/06 23:32:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:38515
19/12/06 23:32:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 23:32:53 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 23:32:53 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575675171.25_4b70ead1-cbbe-4eec-b1f8-ecf7677c4c24', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 23:32:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575675171.25', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49495', 'job_port': u'0'}
19/12/06 23:32:53 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 23:32:53 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:34251.
19/12/06 23:32:53 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 23:32:53 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/06 23:32:53 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 23:32:53 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:34585.
19/12/06 23:32:53 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 23:32:53 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:43731
19/12/06 23:32:53 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 23:32:53 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 23:32:53 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 23:32:53 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 23:32:53 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 23:32:53 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 23:32:53 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 23:32:53 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 23:32:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 23:32:53 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 23:32:53 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 23:32:53 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 23:32:54 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 23:32:54 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 23:32:54 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:46421
19/12/06 23:32:54 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 23:32:54 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 23:32:54 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575675171.25_4b70ead1-cbbe-4eec-b1f8-ecf7677c4c24', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 23:32:54 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575675171.25', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49495', 'job_port': u'0'}
19/12/06 23:32:54 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 23:32:54 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:43927.
19/12/06 23:32:54 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 23:32:54 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/06 23:32:54 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 23:32:54 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:38375.
19/12/06 23:32:54 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 23:32:54 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:45249
19/12/06 23:32:54 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 23:32:54 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 23:32:54 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 23:32:54 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 23:32:54 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 23:32:54 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 23:32:54 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 23:32:54 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 23:32:54 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 23:32:54 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 23:32:54 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 23:32:54 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 23:32:55 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 23:32:55 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 23:32:55 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:35571
19/12/06 23:32:55 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 23:32:55 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 23:32:55 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575675171.25_4b70ead1-cbbe-4eec-b1f8-ecf7677c4c24', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 23:32:55 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575675171.25', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49495', 'job_port': u'0'}
19/12/06 23:32:55 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 23:32:55 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:34805.
19/12/06 23:32:55 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/06 23:32:55 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 23:32:55 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 23:32:55 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:40553.
19/12/06 23:32:55 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 23:32:55 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:39035
19/12/06 23:32:55 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 23:32:55 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 23:32:55 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 23:32:55 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 23:32:55 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 23:32:55 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 23:32:55 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 23:32:55 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 23:32:55 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 23:32:55 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 23:32:55 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 23:32:55 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 23:32:56 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 23:32:56 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 23:32:56 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:40255
19/12/06 23:32:56 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 23:32:56 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 23:32:56 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575675171.25_4b70ead1-cbbe-4eec-b1f8-ecf7677c4c24', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 23:32:56 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575675171.25', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49495', 'job_port': u'0'}
19/12/06 23:32:56 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 23:32:56 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:33531.
19/12/06 23:32:56 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 23:32:56 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/06 23:32:56 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 23:32:56 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:39547.
19/12/06 23:32:56 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 23:32:56 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:33261
19/12/06 23:32:56 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 23:32:56 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 23:32:56 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 23:32:56 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 23:32:56 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 23:32:56 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 23:32:56 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 23:32:56 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 23:32:56 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 23:32:56 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 23:32:56 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 23:32:56 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575675171.25_4b70ead1-cbbe-4eec-b1f8-ecf7677c4c24 finished.
19/12/06 23:32:56 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/06 23:32:56 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_9f770abf-81b2-49ea-9437-83927ccdcd6b","basePath":"/tmp/sparktestT9s7ew"}: {}
java.io.FileNotFoundException: /tmp/sparktestT9s7ew/job_9f770abf-81b2-49ea-9437-83927ccdcd6b/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
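
The "Timed out after 60 seconds" banners and the "# Thread: ..." lines below come from the test harness's watchdog, the handler at portable_runner_test.py line 75 visible in these tracebacks: when a pipeline hangs, it dumps the live threads and raises BaseException, and because it fires from a signal its output interleaves with the unittest report. Roughly, assuming the standard signal.alarm pattern (the names here are illustrative, not the exact Beam test code):

    import signal
    import threading
    import traceback

    TIMEOUT_SECS = 60

    def handler(signum, frame):
        msg = 'Timed out after %d seconds.' % TIMEOUT_SECS
        print('==================== %s ====================' % msg)
        # Show every live thread so a hung pipeline reveals where it is stuck.
        for t in threading.enumerate():
            print('# Thread: %s' % t)
        traceback.print_stack(frame)
        raise BaseException(msg)  # BaseException so ordinary except clauses cannot swallow it

    signal.signal(signal.SIGALRM, handler)
    signal.alarm(TIMEOUT_SECS)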

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140529675323136)>
# Thread: <Thread(Thread-119, started daemon 140529683715840)>
# Thread: <_MainThread(MainThread, started 140530463454976)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140529174046464)>
# Thread: <Thread(Thread-125, started daemon 140529182439168)>
# Thread: <_MainThread(MainThread, started 140530463454976)>
# Thread: <Thread(Thread-119, started daemon 140529683715840)>
# Thread: <Thread(wait_until_finish_read, started daemon 140529675323136)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575675162.82_26fe060c-fe29-48fc-b1b7-2d1325f58c83 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
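
This failure appears consistently across the builds in this thread and is runner-side rather than flaky: test_sdf_with_watermark_tracking exercises a splittable DoFn, and the Spark portable runner's ActiveBundle has no bundle checkpoint handler registered, so a DoFn that defers work by returning a residual restriction cannot be resumed. For orientation, a minimal splittable DoFn in the Python SDK looks roughly like this (a sketch of the mechanism under test, not the actual test pipeline):

    import apache_beam as beam
    from apache_beam.io.restriction_trackers import OffsetRange, OffsetRestrictionTracker
    from apache_beam.transforms.core import RestrictionProvider

    class CharRestrictionProvider(RestrictionProvider):
        def initial_restriction(self, element):
            return OffsetRange(0, len(element))

        def create_tracker(self, restriction):
            return OffsetRestrictionTracker(restriction)

        def restriction_size(self, element, restriction):
            return restriction.size()

    class EmitChars(beam.DoFn):
        # Splittable DoFn: the runner may split or checkpoint the restriction
        # mid-bundle; the resulting residual is what needs a checkpoint handler.
        def process(self, element,
                    tracker=beam.DoFn.RestrictionParam(CharRestrictionProvider())):
            pos = tracker.current_restriction().start
            while tracker.try_claim(pos):
                yield element[pos]
                pos += 1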

----------------------------------------------------------------------
Ran 38 tests in 307.973s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 54s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/6wwdhdk25y4ns

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1718

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1718/display/redirect?page=changes>

Changes:

[robertwb] [BEAM-8882] Implement Impulse() for BundleBasedRunner.

[robertwb] [BEAM-8882] Make Create fn-api agnostic.

[robertwb] [BEAM-8882] Fully specify types for Create composite.

[robertwb] [BEAM-8882] Make Read fn-api agnostic.

[robertwb] [BEAM-8882] Cleanup always-on use_sdf_bounded_source option.

[robertwb] [BEAM-8882] Annotate ParDo and CombineValues operations with proto

[robertwb] [BEAM-8882] Unconditionally populate pipeline_proto_coder_id.

[robertwb] [BEAM-8882] Fix overly-sensitive tests.

[robertwb] Fix sdf tests from create.

[robertwb] [BEAM-8882] Avoid attaching unrecognized properties.

[robertwb] [BEAM-8882] Accommodations for JRH.

[robertwb] Minor cleanup.


------------------------------------------
[...truncated 1.55 MB...]
19/12/06 21:10:26 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:38201
19/12/06 21:10:26 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 21:10:26 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 21:10:26 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575666623.37_559df4e3-7cea-4b5d-b314-dd2017c85472', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 21:10:26 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575666623.37', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:51251', 'job_port': u'0'}
19/12/06 21:10:26 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 21:10:26 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:33141.
19/12/06 21:10:26 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 21:10:26 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 21:10:26 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/06 21:10:26 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:34331.
19/12/06 21:10:26 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 21:10:26 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:41267
19/12/06 21:10:26 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 21:10:26 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 21:10:26 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 21:10:26 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 21:10:26 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 21:10:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 21:10:26 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 21:10:26 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 21:10:26 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 21:10:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 21:10:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 21:10:26 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 21:10:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 21:10:27 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 21:10:27 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:35965
19/12/06 21:10:27 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 21:10:27 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 21:10:27 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575666623.37_559df4e3-7cea-4b5d-b314-dd2017c85472', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 21:10:27 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575666623.37', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:51251', 'job_port': u'0'}
19/12/06 21:10:27 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 21:10:27 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:42357.
19/12/06 21:10:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/06 21:10:27 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 21:10:27 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 21:10:27 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:40239.
19/12/06 21:10:27 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 21:10:27 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:36575
19/12/06 21:10:27 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 21:10:27 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 21:10:27 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 21:10:27 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 21:10:27 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 21:10:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 21:10:27 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 21:10:27 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 21:10:27 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 21:10:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 21:10:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 21:10:27 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 21:10:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 21:10:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 21:10:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:40563
19/12/06 21:10:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 21:10:28 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 21:10:28 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575666623.37_559df4e3-7cea-4b5d-b314-dd2017c85472', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 21:10:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575666623.37', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:51251', 'job_port': u'0'}
19/12/06 21:10:28 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 21:10:28 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:34415.
19/12/06 21:10:28 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 21:10:28 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/06 21:10:28 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 21:10:28 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:43715.
19/12/06 21:10:28 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 21:10:28 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:37109
19/12/06 21:10:28 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 21:10:28 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 21:10:28 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 21:10:28 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 21:10:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 21:10:28 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 21:10:28 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 21:10:28 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 21:10:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 21:10:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 21:10:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 21:10:28 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 21:10:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 21:10:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 21:10:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:34271
19/12/06 21:10:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 21:10:28 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 21:10:28 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575666623.37_559df4e3-7cea-4b5d-b314-dd2017c85472', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 21:10:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575666623.37', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:51251', 'job_port': u'0'}
19/12/06 21:10:29 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 21:10:29 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:41027.
19/12/06 21:10:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/06 21:10:29 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 21:10:29 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 21:10:29 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:32887.
19/12/06 21:10:29 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 21:10:29 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:34709
19/12/06 21:10:29 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 21:10:29 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 21:10:29 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 21:10:29 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 21:10:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 21:10:29 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 21:10:29 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 21:10:29 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 21:10:29 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 21:10:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 21:10:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 21:10:29 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575666623.37_559df4e3-7cea-4b5d-b314-dd2017c85472 finished.
19/12/06 21:10:29 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/06 21:10:29 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_6c5c5016-73d5-4c41-b5ca-6e456a6be577","basePath":"/tmp/sparktest9RLLdl"}: {}
java.io.FileNotFoundException: /tmp/sparktest9RLLdl/job_6c5c5016-73d5-4c41-b5ca-6e456a6be577/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139621433284352)>
# Thread: <Thread(Thread-120, started daemon 139621156341504)>
# Thread: <_MainThread(MainThread, started 139621951805184)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139620528420608)>
# Thread: <Thread(Thread-126, started daemon 139621146113792)>
# Thread: <Thread(Thread-120, started daemon 139621156341504)>
# Thread: <_MainThread(MainThread, started 139621951805184)>
# Thread: <Thread(wait_until_finish_read, started daemon 139621433284352)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575666614.49_8ad695cc-bbcd-4006-97ac-8582fc2befb5 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 317.889s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 47s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/3zjrh3xuz3wdg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1717

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1717/display/redirect?page=changes>

Changes:

[github] [BEAM-3865] Stronger trigger tests. (#10192)

[pabloem] Merge pull request #10236 from [BEAM-8335] Add method to

[bhulette] [BEAM-8427] Add MongoDB to SQL documentation (#10273)


------------------------------------------
[...truncated 1.57 MB...]
19/12/06 19:40:12 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:45687
19/12/06 19:40:12 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 19:40:12 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 19:40:12 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575661209.51_72651fdb-c511-4310-9cda-6ee9d11c3974', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 19:40:12 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575661209.51', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52911', 'job_port': u'0'}
19/12/06 19:40:12 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 19:40:12 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:45583.
19/12/06 19:40:12 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 19:40:12 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/06 19:40:12 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 19:40:12 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:41647.
19/12/06 19:40:12 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 19:40:12 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:35997
19/12/06 19:40:12 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 19:40:12 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 19:40:12 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 19:40:12 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 19:40:12 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 19:40:12 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 19:40:12 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 19:40:12 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 19:40:12 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 19:40:12 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 19:40:12 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 19:40:12 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 19:40:13 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 19:40:13 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 19:40:13 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:44189
19/12/06 19:40:13 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 19:40:13 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 19:40:13 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575661209.51_72651fdb-c511-4310-9cda-6ee9d11c3974', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 19:40:13 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575661209.51', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52911', 'job_port': u'0'}
19/12/06 19:40:13 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 19:40:13 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:42135.
19/12/06 19:40:13 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 19:40:13 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/06 19:40:13 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 19:40:13 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:38083.
19/12/06 19:40:13 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 19:40:13 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:34647
19/12/06 19:40:13 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 19:40:13 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 19:40:13 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 19:40:13 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 19:40:13 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 19:40:13 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 19:40:13 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 19:40:13 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 19:40:13 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 19:40:13 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 19:40:13 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 19:40:13 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 19:40:14 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 19:40:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 19:40:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:39045
19/12/06 19:40:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 19:40:14 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 19:40:14 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575661209.51_72651fdb-c511-4310-9cda-6ee9d11c3974', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 19:40:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575661209.51', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52911', 'job_port': u'0'}
19/12/06 19:40:14 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 19:40:14 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:39675.
19/12/06 19:40:14 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 19:40:14 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/06 19:40:14 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 19:40:14 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:40781.
19/12/06 19:40:14 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 19:40:14 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:39433
19/12/06 19:40:14 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 19:40:14 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 19:40:14 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 19:40:14 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 19:40:14 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 19:40:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 19:40:14 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 19:40:14 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 19:40:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 19:40:14 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 19:40:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 19:40:14 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 19:40:15 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 19:40:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 19:40:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:33019
19/12/06 19:40:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 19:40:15 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 19:40:15 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575661209.51_72651fdb-c511-4310-9cda-6ee9d11c3974', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 19:40:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575661209.51', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52911', 'job_port': u'0'}
19/12/06 19:40:15 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 19:40:15 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:46585.
19/12/06 19:40:15 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 19:40:15 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 19:40:15 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/06 19:40:15 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:43997.
19/12/06 19:40:15 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 19:40:15 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:40133
19/12/06 19:40:15 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 19:40:15 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 19:40:15 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 19:40:15 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 19:40:15 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 19:40:15 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 19:40:15 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 19:40:15 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 19:40:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 19:40:15 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 19:40:15 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 19:40:15 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575661209.51_72651fdb-c511-4310-9cda-6ee9d11c3974 finished.
19/12/06 19:40:15 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/06 19:40:15 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_d6df69a8-2454-4191-869e-a94145a1f196","basePath":"/tmp/sparktesttmujUC"}: {}
java.io.FileNotFoundException: /tmp/sparktesttmujUC/job_d6df69a8-2454-4191-869e-a94145a1f196/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140075123865344)>
# Thread: <Thread(Thread-120, started daemon 140075115472640)>
# Thread: <_MainThread(MainThread, started 140076255598336)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140075107079936)>
# Thread: <Thread(Thread-126, started daemon 140075098687232)>
# Thread: <Thread(Thread-120, started daemon 140075115472640)>
# Thread: <_MainThread(MainThread, started 140076255598336)>
# Thread: <Thread(wait_until_finish_read, started daemon 140075123865344)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
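
All of these timeout tracebacks bottom out in the same place: grpc's _common.wait, which waits on the response condition in short slices (_wait_once with MAXIMUM_WAIT_TIMEOUT) instead of one indefinite block, so control regularly returns to the main thread and signals such as the watchdog's SIGALRM can still be delivered. The shape of that loop, as an illustrative sketch rather than grpc's exact code:

    MAXIMUM_WAIT_TIMEOUT = 0.1  # illustrative slice length

    def wait(wait_fn, response_ready, spin_cb=None):
        # Wake frequently; an untimed wait() could block signal delivery.
        while not response_ready():
            wait_fn(timeout=MAXIMUM_WAIT_TIMEOUT)
            if spin_cb is not None:
                spin_cb()

The hang, then, is not inside gRPC itself: the state stream simply never receives a response from the job server, and the watchdog eventually fires.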

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575661198.63_d2f5041d-0931-4bd6-9b40-1f831db94f79 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
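
The third failure is different in kind: test_sdf_with_watermark_tracking exercises splittable DoFn checkpointing, where the SDK can stop mid-restriction and hand the unprocessed remainder (a "residual") back to the runner to reschedule. The Spark portable runner's ActiveBundle has no handler registered for those checkpoints, hence the UnsupportedOperationException. A toy illustration of the split-and-resume contract, using made-up names (process_range, Residual) rather than any Beam API:

    from collections import namedtuple

    Residual = namedtuple('Residual', ['start', 'stop'])

    def process_range(start, stop, budget):
        """Process [start, stop); stop after `budget` elements and hand the
        remainder back as a residual the runner must reschedule."""
        done = []
        for i in range(start, stop):
            if len(done) >= budget:
                return done, Residual(i, stop)
            done.append(i)
        return done, None

    out, residual = process_range(0, 10, budget=4)
    assert out == [0, 1, 2, 3]
    assert residual == Residual(4, 10)

A runner without a checkpoint handler has nowhere to send that residual, so the bundle fails instead.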

----------------------------------------------------------------------
Ran 38 tests in 356.585s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 57s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/cpdwevcwp6el2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1716

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1716/display/redirect>

Changes:


------------------------------------------
[...truncated 1.57 MB...]
19/12/06 18:30:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:34209
19/12/06 18:30:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 18:30:17 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 18:30:17 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575657014.84_299a4f63-3b48-4938-89ac-726af1f2d688', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 18:30:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575657014.84', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56045', 'job_port': u'0'}
19/12/06 18:30:17 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 18:30:17 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:33981.
19/12/06 18:30:17 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/06 18:30:17 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 18:30:17 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 18:30:17 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:43969.
19/12/06 18:30:17 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 18:30:17 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:42679
19/12/06 18:30:17 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 18:30:17 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 18:30:17 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 18:30:17 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 18:30:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 18:30:17 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 18:30:17 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 18:30:17 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 18:30:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 18:30:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 18:30:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 18:30:17 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
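
Each repetition of the block above is one complete SDK worker lifecycle for a single Spark task: the harness connects a logging client, then the control channel, then the state and data channels, processes its work, and tears everything down. Since the pipeline options set environment_cache_millis to 0, the environment is closed after every bundle, which is why the log repeats this bring-up for each task instead of reusing a warm worker. A rough sketch of the channel bring-up order with plain grpcio; the ports are taken from one cycle above except the logging port, which the log does not print:

    import grpc

    # Order mirrors the log: logging first, then control, state, data.
    logging_channel = grpc.insecure_channel('localhost:50000')  # placeholder
    control_channel = grpc.insecure_channel('localhost:33981')
    state_channel = grpc.insecure_channel('localhost:43969')
    data_channel = grpc.insecure_channel('localhost:42679')

    # ... serve bundle-processing requests over the control channel ...

    for channel in (data_channel, state_channel,
                    control_channel, logging_channel):
        channel.close()
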
19/12/06 18:30:18 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 18:30:18 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 18:30:18 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:41861
19/12/06 18:30:18 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 18:30:18 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 18:30:18 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575657014.84_299a4f63-3b48-4938-89ac-726af1f2d688', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 18:30:18 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575657014.84', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56045', 'job_port': u'0'}
19/12/06 18:30:18 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 18:30:18 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:45809.
19/12/06 18:30:18 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/06 18:30:18 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 18:30:18 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 18:30:18 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:46001.
19/12/06 18:30:18 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 18:30:18 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:39725
19/12/06 18:30:18 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 18:30:18 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 18:30:18 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 18:30:18 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 18:30:18 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 18:30:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 18:30:18 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 18:30:18 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 18:30:18 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 18:30:18 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 18:30:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 18:30:18 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 18:30:19 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 18:30:19 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 18:30:19 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:35441
19/12/06 18:30:19 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 18:30:19 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 18:30:19 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575657014.84_299a4f63-3b48-4938-89ac-726af1f2d688', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 18:30:19 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575657014.84', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56045', 'job_port': u'0'}
19/12/06 18:30:19 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 18:30:19 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:43757.
19/12/06 18:30:19 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 18:30:19 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/06 18:30:19 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 18:30:19 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:44767.
19/12/06 18:30:19 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 18:30:19 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:34139
19/12/06 18:30:19 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 18:30:19 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 18:30:19 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 18:30:19 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 18:30:19 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 18:30:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 18:30:19 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 18:30:19 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 18:30:19 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 18:30:19 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 18:30:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 18:30:19 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 18:30:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 18:30:20 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 18:30:20 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:45767
19/12/06 18:30:20 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 18:30:20 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 18:30:20 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575657014.84_299a4f63-3b48-4938-89ac-726af1f2d688', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 18:30:20 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575657014.84', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56045', 'job_port': u'0'}
19/12/06 18:30:20 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 18:30:20 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:45905.
19/12/06 18:30:20 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 18:30:20 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/06 18:30:20 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 18:30:20 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:39533.
19/12/06 18:30:20 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 18:30:20 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:43563
19/12/06 18:30:20 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 18:30:20 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 18:30:20 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 18:30:20 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 18:30:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 18:30:20 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 18:30:20 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 18:30:20 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 18:30:20 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 18:30:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 18:30:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 18:30:20 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575657014.84_299a4f63-3b48-4938-89ac-726af1f2d688 finished.
19/12/06 18:30:20 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/06 18:30:20 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_d0d69e61-f0f8-49ac-8496-e0008815fd60","basePath":"/tmp/sparktesthDBrAK"}: {}
java.io.FileNotFoundException: /tmp/sparktesthDBrAK/job_d0d69e61-f0f8-49ac-8496-e0008815fd60/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
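
This cleanup failure is cosmetic: the job staged no artifacts (note the repeated "GetManifest for __no_artifacts_staged__" lines), so the MANIFEST file was never written, and removeArtifacts fails on its absence after the job has already succeeded. A tolerant version would check before deleting; an illustrative Python sketch, not the runner's actual Java code:

    import os
    import shutil

    def remove_staging_dir(base_path, session_id):
        """Best-effort removal of a job staging directory; skips quietly
        when no MANIFEST exists because nothing was ever staged."""
        job_dir = os.path.join(base_path, session_id)
        if not os.path.exists(os.path.join(job_dir, 'MANIFEST')):
            return  # nothing staged, nothing to clean
        shutil.rmtree(job_dir, ignore_errors=True)

The job state still reaches DONE immediately afterwards, so these stack traces are noise rather than the cause of the test failures.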
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139855050311424)>

# Thread: <Thread(Thread-119, started daemon 139855058704128)>

# Thread: <_MainThread(MainThread, started 139855838443264)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139855024346880)>

# Thread: <Thread(Thread-125, started daemon 139855033001728)>

# Thread: <_MainThread(MainThread, started 139855838443264)>

# Thread: <Thread(Thread-119, started daemon 139855058704128)>

# Thread: <Thread(wait_until_finish_read, started daemon 139855050311424)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575657005.37_c664e3af-7229-470d-9571-49c7fa82b667 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 313.602s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 41s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/zgdbqgzvduc2u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1715

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1715/display/redirect?page=changes>

Changes:

[github] [BEAM-8882] Fully populate log messages. (#10292)


------------------------------------------
[...truncated 1.57 MB...]
19/12/06 16:44:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:36453
19/12/06 16:44:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 16:44:03 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 16:44:03 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575650640.76_8722b242-e5cb-4621-90cf-5c0dd53bfa11', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 16:44:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575650640.76', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53481', 'job_port': u'0'}
19/12/06 16:44:03 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 16:44:03 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:40669.
19/12/06 16:44:03 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 16:44:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/06 16:44:03 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 16:44:03 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:44081.
19/12/06 16:44:03 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 16:44:03 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:38891
19/12/06 16:44:03 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 16:44:03 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 16:44:03 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 16:44:03 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 16:44:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 16:44:03 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 16:44:03 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 16:44:03 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 16:44:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 16:44:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 16:44:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 16:44:03 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
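
The recurring "Discarding unparseable args" warning is likewise expected noise: when the SDK worker re-parses the pipeline options it was handed, flags it does not itself define (runner-side options such as --spark_master, or test harness options such as --job_server_timeout) are split off and logged rather than rejected. A sketch of that lenient parsing with argparse; this is illustrative, not Beam's actual PipelineOptions code:

    import argparse

    parser = argparse.ArgumentParser()
    parser.add_argument('--job_endpoint')
    parser.add_argument('--sdk_worker_parallelism')

    known, unknown = parser.parse_known_args([
        '--job_endpoint=localhost:53481',
        '--spark_master=local',      # unknown to this parser
        '--job_server_timeout=60',   # unknown to this parser
    ])
    print('Discarding unparseable args: %s' % unknown)
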
19/12/06 16:44:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 16:44:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 16:44:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:35541
19/12/06 16:44:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 16:44:04 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 16:44:04 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575650640.76_8722b242-e5cb-4621-90cf-5c0dd53bfa11', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 16:44:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575650640.76', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53481', 'job_port': u'0'}
19/12/06 16:44:04 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 16:44:04 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:45303.
19/12/06 16:44:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/06 16:44:04 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 16:44:04 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 16:44:04 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:46073.
19/12/06 16:44:04 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 16:44:04 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 16:44:04 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:33657
19/12/06 16:44:04 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 16:44:04 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 16:44:04 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 16:44:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 16:44:04 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 16:44:04 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 16:44:04 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 16:44:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 16:44:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 16:44:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 16:44:04 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 16:44:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 16:44:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 16:44:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:32887
19/12/06 16:44:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 16:44:04 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 16:44:04 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575650640.76_8722b242-e5cb-4621-90cf-5c0dd53bfa11', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 16:44:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575650640.76', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53481', 'job_port': u'0'}
19/12/06 16:44:05 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:33489.
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 16:44:05 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:37463.
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 16:44:05 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:39629
19/12/06 16:44:05 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 16:44:05 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 16:44:05 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 16:44:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 16:44:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 16:44:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 16:44:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 16:44:05 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
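
One more detail repeated in every cycle: "Creating state cache with size 0". A zero-sized cache means the worker keeps no user state locally, so each state read during a bundle is a round trip over the Fn State channel. A toy cache that degenerates to pass-through at size 0; illustrative only, not apache_beam.runners.worker.statecache:

    import collections

    class ToyStateCache(object):
        def __init__(self, max_entries):
            self._max = max_entries
            self._entries = collections.OrderedDict()

        def get(self, key, fetch_fn):
            if self._max == 0:
                return fetch_fn(key)  # caching disabled: always fetch
            if key in self._entries:
                value = self._entries.pop(key)
                self._entries[key] = value  # refresh recency
                return value
            value = fetch_fn(key)
            self._entries[key] = value
            if len(self._entries) > self._max:
                self._entries.popitem(last=False)  # evict least recent
            return value

    cache = ToyStateCache(0)
    assert cache.get('k', lambda k: 'fetched remotely') == 'fetched remotely'
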
19/12/06 16:44:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 16:44:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 16:44:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:36069
19/12/06 16:44:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 16:44:05 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 16:44:05 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575650640.76_8722b242-e5cb-4621-90cf-5c0dd53bfa11', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 16:44:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575650640.76', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53481', 'job_port': u'0'}
19/12/06 16:44:05 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:46315.
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 16:44:05 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:43601.
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 16:44:05 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:46643
19/12/06 16:44:05 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 16:44:05 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 16:44:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 16:44:05 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 16:44:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 16:44:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 16:44:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 16:44:05 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575650640.76_8722b242-e5cb-4621-90cf-5c0dd53bfa11 finished.
19/12/06 16:44:05 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/06 16:44:05 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_5fb2d848-b25b-4446-a59c-ca30492a1ff3","basePath":"/tmp/sparktestZ5fHLe"}: {}
java.io.FileNotFoundException: /tmp/sparktestZ5fHLe/job_5fb2d848-b25b-4446-a59c-ca30492a1ff3/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139975643948800)>

# Thread: <Thread(Thread-119, started daemon 139975627163392)>

# Thread: <_MainThread(MainThread, started 139976423687936)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139975610115840)>

# Thread: <Thread(Thread-125, started daemon 139975618770688)>

# Thread: <Thread(Thread-119, started daemon 139975627163392)>

# Thread: <Thread(wait_until_finish_read, started daemon 139975643948800)>

# Thread: <_MainThread(MainThread, started 139976423687936)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575650631.64_e514e5ff-cb86-4928-bd5b-1691c301aa08 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 299.131s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 38s
60 actionable tasks: 48 executed, 12 from cache

Publishing build scan...
https://scans.gradle.com/s/nqxdsorp4qtic

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1714

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1714/display/redirect?page=changes>

Changes:

[thw] [BEAM-8815] Define the no artifacts retrieval token in proto


------------------------------------------
[...truncated 1.32 MB...]
19/12/06 15:02:39 INFO sdk_worker_main.start: Status HTTP server running at localhost:44801
19/12/06 15:02:39 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 15:02:39 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 15:02:39 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575644557.22_32b1c34b-42e2-4712-8a4c-938886205b3a', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 15:02:39 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575644557.22', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:39481', 'job_port': u'0'}
19/12/06 15:02:39 INFO statecache.__init__: Creating state cache with size 0
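
Each of the workers above is launched through a PROCESS environment, i.e. the runner executes the sdk_worker.sh script named in environment_config rather than pulling a Docker image. A rough sketch of the options a pipeline would pass to run this way; the endpoint port and script path below are per-run values and stand-ins:

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:39481',  # per-run port from the log
        '--environment_type=PROCESS',
        '--environment_config={"command": "/path/to/sdk_worker.sh"}',  # per-checkout path
        '--experiments=beam_fn_api',
        '--sdk_worker_parallelism=1',
    ])
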
19/12/06 15:02:39 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35543.
19/12/06 15:02:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/06 15:02:39 INFO sdk_worker.__init__: Control channel established.
19/12/06 15:02:39 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 15:02:39 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36629.
19/12/06 15:02:39 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 15:02:39 INFO data_plane.create_data_channel: Creating client data channel for localhost:39027
19/12/06 15:02:39 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 15:02:39 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 15:02:39 INFO sdk_worker.run: No more requests from control plane
19/12/06 15:02:39 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 15:02:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 15:02:39 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 15:02:39 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 15:02:39 INFO sdk_worker.run: Done consuming work.
19/12/06 15:02:39 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 15:02:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 15:02:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 15:02:39 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 15:02:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 15:02:40 INFO sdk_worker_main.main: Logging handler created.
19/12/06 15:02:40 INFO sdk_worker_main.start: Status HTTP server running at localhost:40181
19/12/06 15:02:40 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 15:02:40 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 15:02:40 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575644557.22_32b1c34b-42e2-4712-8a4c-938886205b3a', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 15:02:40 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575644557.22', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:39481', 'job_port': u'0'}
19/12/06 15:02:40 INFO statecache.__init__: Creating state cache with size 0
19/12/06 15:02:40 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33239.
19/12/06 15:02:40 INFO sdk_worker.__init__: Control channel established.
19/12/06 15:02:40 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 15:02:40 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/06 15:02:40 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44197.
19/12/06 15:02:40 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 15:02:40 INFO data_plane.create_data_channel: Creating client data channel for localhost:34911
19/12/06 15:02:40 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 15:02:40 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 15:02:40 INFO sdk_worker.run: No more requests from control plane
19/12/06 15:02:40 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 15:02:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 15:02:40 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 15:02:40 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 15:02:40 INFO sdk_worker.run: Done consuming work.
19/12/06 15:02:40 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 15:02:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 15:02:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 15:02:40 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 15:02:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 15:02:41 INFO sdk_worker_main.main: Logging handler created.
19/12/06 15:02:41 INFO sdk_worker_main.start: Status HTTP server running at localhost:40339
19/12/06 15:02:41 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 15:02:41 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 15:02:41 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575644557.22_32b1c34b-42e2-4712-8a4c-938886205b3a', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 15:02:41 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575644557.22', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:39481', 'job_port': u'0'}
19/12/06 15:02:41 INFO statecache.__init__: Creating state cache with size 0
19/12/06 15:02:41 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39205.
19/12/06 15:02:41 INFO sdk_worker.__init__: Control channel established.
19/12/06 15:02:41 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/06 15:02:41 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 15:02:41 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40931.
19/12/06 15:02:41 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 15:02:41 INFO data_plane.create_data_channel: Creating client data channel for localhost:38511
19/12/06 15:02:41 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 15:02:41 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 15:02:41 INFO sdk_worker.run: No more requests from control plane
19/12/06 15:02:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 15:02:41 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 15:02:41 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 15:02:41 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 15:02:41 INFO sdk_worker.run: Done consuming work.
19/12/06 15:02:41 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 15:02:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 15:02:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 15:02:41 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 15:02:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 15:02:42 INFO sdk_worker_main.main: Logging handler created.
19/12/06 15:02:42 INFO sdk_worker_main.start: Status HTTP server running at localhost:42647
19/12/06 15:02:42 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 15:02:42 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 15:02:42 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575644557.22_32b1c34b-42e2-4712-8a4c-938886205b3a', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 15:02:42 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575644557.22', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:39481', 'job_port': u'0'}
19/12/06 15:02:42 INFO statecache.__init__: Creating state cache with size 0
19/12/06 15:02:42 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36887.
19/12/06 15:02:42 INFO sdk_worker.__init__: Control channel established.
19/12/06 15:02:42 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 15:02:42 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/06 15:02:42 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46071.
19/12/06 15:02:42 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 15:02:42 INFO data_plane.create_data_channel: Creating client data channel for localhost:46393
19/12/06 15:02:42 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 15:02:42 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 15:02:42 INFO sdk_worker.run: No more requests from control plane
19/12/06 15:02:42 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 15:02:42 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 15:02:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 15:02:42 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 15:02:42 INFO sdk_worker.run: Done consuming work.
19/12/06 15:02:42 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 15:02:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 15:02:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 15:02:42 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575644557.22_32b1c34b-42e2-4712-8a4c-938886205b3a finished.
19/12/06 15:02:42 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/06 15:02:42 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_d550efff-4571-492d-97fc-c587de27aa8f","basePath":"/tmp/sparktestEq_moi"}: {}
java.io.FileNotFoundException: /tmp/sparktestEq_moi/job_d550efff-4571-492d-97fc-c587de27aa8f/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
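
This FileNotFoundException is a cleanup artifact rather than a test failure: the job staged no artifacts (see the repeated "GetManifest for __no_artifacts_staged__" lines), so the MANIFEST that BeamFileSystemArtifactStagingService.removeArtifacts tries to read never existed. The tolerant probe-before-open pattern, sketched with Beam's Python FileSystems API purely as an analogy (the code path in the stack trace is Java):

    from apache_beam.io.filesystems import FileSystems

    manifest = '/tmp/sparktestEq_moi/job_d550efff-4571-492d-97fc-c587de27aa8f/MANIFEST'
    # Probe before opening, so a job that staged nothing cleans up silently
    # instead of logging a FileNotFoundException.
    if FileSystems.exists(manifest):
        stream = FileSystems.open(manifest)
        try:
            stream.read()
        finally:
            stream.close()
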
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
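
For context on what test_pardo_state_with_custom_key_coder exercises: every state read or write in a stateful DoFn becomes a gRPC state request from the SDK harness to the runner, addressed by the element's encoded key, which is why the key coder matters. A generic stateful DoFn, not the failing test itself, looks roughly like:

    import apache_beam as beam
    from apache_beam.coders import VarIntCoder
    from apache_beam.transforms.userstate import CombiningValueStateSpec

    class CountPerKeyDoFn(beam.DoFn):
        # Every read()/add() below becomes a gRPC state request from the SDK
        # harness to the runner, addressed by the element's encoded key.
        COUNT = CombiningValueStateSpec('count', VarIntCoder(), sum)

        def process(self, element, count=beam.DoFn.StateParam(COUNT)):
            key, value = element
            count.add(1)
            yield (key, value, count.read())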

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139883444164352)>
# Thread: <Thread(Thread-118, started daemon 139883452557056)>
# Thread: <_MainThread(MainThread, started 139884578526976)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139883427378944)>
# Thread: <Thread(Thread-124, started daemon 139883435771648)>
# Thread: <Thread(Thread-118, started daemon 139883452557056)>
# Thread: <Thread(wait_until_finish_read, started daemon 139883444164352)>
# Thread: <_MainThread(MainThread, started 139884578526976)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575644548.36_5f789389-1546-47c2-8c0f-2c10dfb6c283 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
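
The assert_that(actual, equal_to(list(''.join(data)))) idiom in this traceback compares the pipeline's output, one element per character, against a flattened copy of the input. A minimal stand-in with hypothetical data (the real test's input is not shown in this log):

    import apache_beam as beam
    from apache_beam.testing.util import assert_that, equal_to

    data = ['abc', 'defghi']  # stand-in input
    with beam.Pipeline() as p:
        # One output element per character, so the expected side is
        # list(''.join(data)) exactly as in the traceback above.
        actual = p | beam.Create(data) | beam.FlatMap(list)
        assert_that(actual, equal_to(list(''.join(data))))
    # Exiting the with-block runs pipeline.run().wait_until_finish()
    # (pipeline.py:436 in the traceback); a runner-side failure such as the
    # missing bundle checkpoint handler surfaces there as RuntimeError.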

----------------------------------------------------------------------
Ran 38 tests in 292.611s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 42s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/khdttcvzjbyo4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1713

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1713/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/12/06 12:13:01 INFO sdk_worker_main.start: Status HTTP server running at localhost:35033
19/12/06 12:13:01 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 12:13:01 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 12:13:01 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575634379.46_c7bc8eb9-7b79-4bc4-9972-ddd0ea5e8eb1', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 12:13:01 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575634379.46', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58649', 'job_port': u'0'}
19/12/06 12:13:01 INFO statecache.__init__: Creating state cache with size 0
19/12/06 12:13:01 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37445.
19/12/06 12:13:01 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/06 12:13:01 INFO sdk_worker.__init__: Control channel established.
19/12/06 12:13:01 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 12:13:01 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44233.
19/12/06 12:13:01 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 12:13:01 INFO data_plane.create_data_channel: Creating client data channel for localhost:37487
19/12/06 12:13:01 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 12:13:01 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 12:13:01 INFO sdk_worker.run: No more requests from control plane
19/12/06 12:13:01 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 12:13:01 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 12:13:01 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 12:13:01 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 12:13:01 INFO sdk_worker.run: Done consuming work.
19/12/06 12:13:01 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 12:13:01 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 12:13:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 12:13:02 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 12:13:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 12:13:02 INFO sdk_worker_main.main: Logging handler created.
19/12/06 12:13:02 INFO sdk_worker_main.start: Status HTTP server running at localhost:35471
19/12/06 12:13:02 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 12:13:02 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 12:13:02 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575634379.46_c7bc8eb9-7b79-4bc4-9972-ddd0ea5e8eb1', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 12:13:02 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575634379.46', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58649', 'job_port': u'0'}
19/12/06 12:13:02 INFO statecache.__init__: Creating state cache with size 0
19/12/06 12:13:02 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40435.
19/12/06 12:13:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/06 12:13:02 INFO sdk_worker.__init__: Control channel established.
19/12/06 12:13:02 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 12:13:02 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38373.
19/12/06 12:13:02 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 12:13:02 INFO data_plane.create_data_channel: Creating client data channel for localhost:38359
19/12/06 12:13:02 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 12:13:02 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 12:13:02 INFO sdk_worker.run: No more requests from control plane
19/12/06 12:13:02 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 12:13:02 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 12:13:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 12:13:02 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 12:13:02 INFO sdk_worker.run: Done consuming work.
19/12/06 12:13:02 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 12:13:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 12:13:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 12:13:02 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 12:13:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 12:13:03 INFO sdk_worker_main.main: Logging handler created.
19/12/06 12:13:03 INFO sdk_worker_main.start: Status HTTP server running at localhost:36059
19/12/06 12:13:03 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 12:13:03 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 12:13:03 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575634379.46_c7bc8eb9-7b79-4bc4-9972-ddd0ea5e8eb1', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 12:13:03 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575634379.46', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58649', 'job_port': u'0'}
19/12/06 12:13:03 INFO statecache.__init__: Creating state cache with size 0
19/12/06 12:13:03 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36479.
19/12/06 12:13:03 INFO sdk_worker.__init__: Control channel established.
19/12/06 12:13:03 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 12:13:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/06 12:13:03 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37909.
19/12/06 12:13:03 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 12:13:03 INFO data_plane.create_data_channel: Creating client data channel for localhost:36609
19/12/06 12:13:03 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 12:13:03 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 12:13:03 INFO sdk_worker.run: No more requests from control plane
19/12/06 12:13:03 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 12:13:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 12:13:03 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 12:13:03 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 12:13:03 INFO sdk_worker.run: Done consuming work.
19/12/06 12:13:03 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 12:13:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 12:13:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 12:13:03 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 12:13:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 12:13:04 INFO sdk_worker_main.main: Logging handler created.
19/12/06 12:13:04 INFO sdk_worker_main.start: Status HTTP server running at localhost:45571
19/12/06 12:13:04 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 12:13:04 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 12:13:04 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575634379.46_c7bc8eb9-7b79-4bc4-9972-ddd0ea5e8eb1', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 12:13:04 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575634379.46', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58649', 'job_port': u'0'}
19/12/06 12:13:04 INFO statecache.__init__: Creating state cache with size 0
19/12/06 12:13:04 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38375.
19/12/06 12:13:04 INFO sdk_worker.__init__: Control channel established.
19/12/06 12:13:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/06 12:13:04 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 12:13:04 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34801.
19/12/06 12:13:04 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 12:13:04 INFO data_plane.create_data_channel: Creating client data channel for localhost:36575
19/12/06 12:13:04 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 12:13:04 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 12:13:04 INFO sdk_worker.run: No more requests from control plane
19/12/06 12:13:04 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 12:13:04 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 12:13:04 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 12:13:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 12:13:04 INFO sdk_worker.run: Done consuming work.
19/12/06 12:13:04 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 12:13:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 12:13:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 12:13:04 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575634379.46_c7bc8eb9-7b79-4bc4-9972-ddd0ea5e8eb1 finished.
19/12/06 12:13:04 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/06 12:13:04 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_03524732-eacf-494a-b758-0737663708e5","basePath":"/tmp/sparktestlqLG_J"}: {}
java.io.FileNotFoundException: /tmp/sparktestlqLG_J/job_03524732-eacf-494a-b758-0737663708e5/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140124411623168)>
# Thread: <Thread(Thread-119, started daemon 140124394837760)>
# Thread: <_MainThread(MainThread, started 140125191362304)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140123903616768)>
# Thread: <Thread(Thread-125, started daemon 140123912009472)>
# Thread: <Thread(Thread-119, started daemon 140124394837760)>
# Thread: <Thread(wait_until_finish_read, started daemon 140124411623168)>
# Thread: <_MainThread(MainThread, started 140125191362304)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575634370.77_c91c082a-36ac-410e-9288-90659c1f0960 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 293.282s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 35s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/xoj7shvaoc5xm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1712

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1712/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/12/06 06:13:04 INFO sdk_worker_main.start: Status HTTP server running at localhost:36343
19/12/06 06:13:04 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 06:13:04 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 06:13:04 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575612781.21_a7ff636c-d605-487d-98d5-925b6eadc0aa', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 06:13:04 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575612781.21', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37549', 'job_port': u'0'}
19/12/06 06:13:04 INFO statecache.__init__: Creating state cache with size 0
19/12/06 06:13:04 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45557.
19/12/06 06:13:04 INFO sdk_worker.__init__: Control channel established.
19/12/06 06:13:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/06 06:13:04 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 06:13:04 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41109.
19/12/06 06:13:04 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 06:13:04 INFO data_plane.create_data_channel: Creating client data channel for localhost:44495
19/12/06 06:13:04 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 06:13:04 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 06:13:04 INFO sdk_worker.run: No more requests from control plane
19/12/06 06:13:04 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 06:13:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 06:13:04 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 06:13:04 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 06:13:04 INFO sdk_worker.run: Done consuming work.
19/12/06 06:13:04 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 06:13:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 06:13:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 06:13:05 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 06:13:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 06:13:05 INFO sdk_worker_main.main: Logging handler created.
19/12/06 06:13:05 INFO sdk_worker_main.start: Status HTTP server running at localhost:36869
19/12/06 06:13:05 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 06:13:05 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 06:13:05 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575612781.21_a7ff636c-d605-487d-98d5-925b6eadc0aa', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 06:13:05 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575612781.21', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37549', 'job_port': u'0'}
19/12/06 06:13:05 INFO statecache.__init__: Creating state cache with size 0
19/12/06 06:13:05 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35999.
19/12/06 06:13:05 INFO sdk_worker.__init__: Control channel established.
19/12/06 06:13:05 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/06 06:13:05 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 06:13:05 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41013.
19/12/06 06:13:05 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 06:13:05 INFO data_plane.create_data_channel: Creating client data channel for localhost:38953
19/12/06 06:13:05 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 06:13:05 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 06:13:05 INFO sdk_worker.run: No more requests from control plane
19/12/06 06:13:05 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 06:13:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 06:13:05 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 06:13:05 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 06:13:05 INFO sdk_worker.run: Done consuming work.
19/12/06 06:13:05 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 06:13:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 06:13:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 06:13:06 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 06:13:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 06:13:06 INFO sdk_worker_main.main: Logging handler created.
19/12/06 06:13:06 INFO sdk_worker_main.start: Status HTTP server running at localhost:43325
19/12/06 06:13:06 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 06:13:06 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 06:13:06 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575612781.21_a7ff636c-d605-487d-98d5-925b6eadc0aa', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 06:13:06 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575612781.21', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37549', 'job_port': u'0'}
19/12/06 06:13:06 INFO statecache.__init__: Creating state cache with size 0
19/12/06 06:13:06 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44949.
19/12/06 06:13:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/06 06:13:06 INFO sdk_worker.__init__: Control channel established.
19/12/06 06:13:06 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 06:13:06 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45649.
19/12/06 06:13:06 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 06:13:06 INFO data_plane.create_data_channel: Creating client data channel for localhost:41315
19/12/06 06:13:06 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 06:13:06 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 06:13:06 INFO sdk_worker.run: No more requests from control plane
19/12/06 06:13:06 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 06:13:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 06:13:06 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 06:13:06 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 06:13:06 INFO sdk_worker.run: Done consuming work.
19/12/06 06:13:06 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 06:13:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 06:13:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 06:13:06 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 06:13:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 06:13:07 INFO sdk_worker_main.main: Logging handler created.
19/12/06 06:13:07 INFO sdk_worker_main.start: Status HTTP server running at localhost:34165
19/12/06 06:13:07 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 06:13:07 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 06:13:07 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575612781.21_a7ff636c-d605-487d-98d5-925b6eadc0aa', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 06:13:07 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575612781.21', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37549', 'job_port': u'0'}
19/12/06 06:13:07 INFO statecache.__init__: Creating state cache with size 0
19/12/06 06:13:07 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39567.
19/12/06 06:13:07 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/06 06:13:07 INFO sdk_worker.__init__: Control channel established.
19/12/06 06:13:07 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 06:13:07 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33725.
19/12/06 06:13:07 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 06:13:07 INFO data_plane.create_data_channel: Creating client data channel for localhost:46585
19/12/06 06:13:07 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 06:13:07 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 06:13:07 INFO sdk_worker.run: No more requests from control plane
19/12/06 06:13:07 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 06:13:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 06:13:07 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 06:13:07 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 06:13:07 INFO sdk_worker.run: Done consuming work.
19/12/06 06:13:07 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 06:13:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 06:13:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 06:13:07 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575612781.21_a7ff636c-d605-487d-98d5-925b6eadc0aa finished.
19/12/06 06:13:07 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/06 06:13:07 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_5900e238-b9e9-4f4c-9664-f57ca949ab78","basePath":"/tmp/sparktestAV69Ci"}: {}
java.io.FileNotFoundException: /tmp/sparktestAV69Ci/job_5900e238-b9e9-4f4c-9664-f57ca949ab78/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
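
This cleanup warning is benign for the run above: no artifacts were staged (note the repeated "GetManifest for __no_artifacts_staged__" lines), so the MANIFEST that BeamFileSystemArtifactStagingService.removeArtifacts tries to read was never written. A tolerant cleanup simply treats the missing file as already removed; a minimal Python sketch of that pattern, illustrative only (the real code path is the Java stack trace above):

    import errno
    import os

    def remove_if_present(path):
        # Treat a manifest that was never written (or is already gone)
        # as successfully removed instead of raising.
        try:
            os.remove(path)
        except OSError as e:
            if e.errno != errno.ENOENT:
                raise
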
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140092687054592)>

# Thread: <Thread(Thread-119, started daemon 140092678661888)>

# Thread: <_MainThread(MainThread, started 140093466793728)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140092652173056)>

# Thread: <Thread(Thread-125, started daemon 140092660565760)>

# Thread: <Thread(Thread-119, started daemon 140092678661888)>

# Thread: <_MainThread(MainThread, started 140093466793728)>

# Thread: <Thread(wait_until_finish_read, started daemon 140092687054592)>
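
The "# Thread:" dumps and "Timed out" banners above come from the test suite's watchdog: the handler at portable_runner_test.py line 75 fires while wait_until_finish is still blocked in grpc, prints every live thread's stack, and then raises BaseException so the hung test cannot catch it as an ordinary Exception. (In the raw log these dumps race with unittest's report output, which is why they arrive interleaved with the tracebacks.) A minimal sketch of such a watchdog, assuming a SIGALRM-based handler; the names below are illustrative, not Beam's actual code:

    import signal
    import sys
    import threading
    import traceback

    def install_timeout(seconds):
        # Hypothetical watchdog: on timeout, print a banner, dump the
        # stack of every live thread, then raise BaseException so a
        # stuck test cannot swallow it with `except Exception`.
        def handler(signum, frame):
            msg = 'Timed out after %d seconds.' % seconds
            print('==================== %s ====================' % msg)
            frames = sys._current_frames()
            for thread in threading.enumerate():
                print('# Thread: %s' % thread)
                traceback.print_stack(frames.get(thread.ident))
            raise BaseException(msg)
        signal.signal(signal.SIGALRM, handler)
        signal.alarm(seconds)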

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575612770.33_024dc405-19d1-42e4-83b6-527821825d8c failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
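
All three failures surface through the same assert_that/equal_to pattern from apache_beam.testing.util: leaving the with-block runs the pipeline and blocks in wait_until_finish (Pipeline.__exit__ calls self.run().wait_until_finish(), as the tracebacks show), which is where both the watchdog timeouts and the FAILED state are raised. A minimal standalone example of the pattern (simplified, not the failing SDF test):

    import apache_beam as beam
    from apache_beam.testing.util import assert_that, equal_to

    # Pipeline.__exit__ calls run().wait_until_finish(), so assertion
    # and job failures are raised when the with-block exits.
    with beam.Pipeline() as p:
        actual = p | beam.Create(['a', 'b', 'c'])
        assert_that(actual, equal_to(['a', 'b', 'c']))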

----------------------------------------------------------------------
Ran 38 tests in 310.838s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 48s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/drkdrb4r7ofsc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1711

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1711/display/redirect?page=changes>

Changes:

[dcavazos] [BEAM-7390] Add code snippet for Sample


------------------------------------------
[...truncated 1.32 MB...]
19/12/06 04:11:24 INFO sdk_worker_main.start: Status HTTP server running at localhost:40705
19/12/06 04:11:24 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 04:11:24 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 04:11:24 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575605481.94_d73259ad-0335-4aff-abe3-c53837469793', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 04:11:24 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575605481.94', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35249', 'job_port': u'0'}
19/12/06 04:11:24 INFO statecache.__init__: Creating state cache with size 0
19/12/06 04:11:24 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35759.
19/12/06 04:11:24 INFO sdk_worker.__init__: Control channel established.
19/12/06 04:11:24 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/06 04:11:24 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 04:11:24 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42881.
19/12/06 04:11:24 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 04:11:24 INFO data_plane.create_data_channel: Creating client data channel for localhost:35091
19/12/06 04:11:24 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 04:11:24 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 04:11:24 INFO sdk_worker.run: No more requests from control plane
19/12/06 04:11:24 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 04:11:24 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 04:11:24 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 04:11:24 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 04:11:24 INFO sdk_worker.run: Done consuming work.
19/12/06 04:11:24 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 04:11:24 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 04:11:24 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 04:11:24 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 04:11:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 04:11:25 INFO sdk_worker_main.main: Logging handler created.
19/12/06 04:11:25 INFO sdk_worker_main.start: Status HTTP server running at localhost:38703
19/12/06 04:11:25 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 04:11:25 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 04:11:25 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575605481.94_d73259ad-0335-4aff-abe3-c53837469793', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 04:11:25 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575605481.94', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35249', 'job_port': u'0'}
19/12/06 04:11:25 INFO statecache.__init__: Creating state cache with size 0
19/12/06 04:11:25 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35411.
19/12/06 04:11:25 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/06 04:11:25 INFO sdk_worker.__init__: Control channel established.
19/12/06 04:11:25 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 04:11:25 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41051.
19/12/06 04:11:25 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 04:11:25 INFO data_plane.create_data_channel: Creating client data channel for localhost:38473
19/12/06 04:11:25 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 04:11:25 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 04:11:25 INFO sdk_worker.run: No more requests from control plane
19/12/06 04:11:25 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 04:11:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 04:11:25 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 04:11:25 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 04:11:25 INFO sdk_worker.run: Done consuming work.
19/12/06 04:11:25 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 04:11:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 04:11:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 04:11:25 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 04:11:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 04:11:26 INFO sdk_worker_main.main: Logging handler created.
19/12/06 04:11:26 INFO sdk_worker_main.start: Status HTTP server running at localhost:41785
19/12/06 04:11:26 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 04:11:26 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 04:11:26 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575605481.94_d73259ad-0335-4aff-abe3-c53837469793', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 04:11:26 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575605481.94', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35249', 'job_port': u'0'}
19/12/06 04:11:26 INFO statecache.__init__: Creating state cache with size 0
19/12/06 04:11:26 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34471.
19/12/06 04:11:26 INFO sdk_worker.__init__: Control channel established.
19/12/06 04:11:26 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 04:11:26 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/06 04:11:26 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39499.
19/12/06 04:11:26 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 04:11:26 INFO data_plane.create_data_channel: Creating client data channel for localhost:34769
19/12/06 04:11:26 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 04:11:26 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 04:11:26 INFO sdk_worker.run: No more requests from control plane
19/12/06 04:11:26 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 04:11:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 04:11:26 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 04:11:26 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 04:11:26 INFO sdk_worker.run: Done consuming work.
19/12/06 04:11:26 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 04:11:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 04:11:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 04:11:26 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 04:11:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 04:11:27 INFO sdk_worker_main.main: Logging handler created.
19/12/06 04:11:27 INFO sdk_worker_main.start: Status HTTP server running at localhost:46431
19/12/06 04:11:27 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 04:11:27 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 04:11:27 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575605481.94_d73259ad-0335-4aff-abe3-c53837469793', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 04:11:27 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575605481.94', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35249', 'job_port': u'0'}
19/12/06 04:11:27 INFO statecache.__init__: Creating state cache with size 0
19/12/06 04:11:27 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34043.
19/12/06 04:11:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/06 04:11:27 INFO sdk_worker.__init__: Control channel established.
19/12/06 04:11:27 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 04:11:27 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37589.
19/12/06 04:11:27 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 04:11:27 INFO data_plane.create_data_channel: Creating client data channel for localhost:46277
19/12/06 04:11:27 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 04:11:27 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 04:11:27 INFO sdk_worker.run: No more requests from control plane
19/12/06 04:11:27 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 04:11:27 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 04:11:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 04:11:27 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 04:11:27 INFO sdk_worker.run: Done consuming work.
19/12/06 04:11:27 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 04:11:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 04:11:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 04:11:27 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575605481.94_d73259ad-0335-4aff-abe3-c53837469793 finished.
19/12/06 04:11:27 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/06 04:11:27 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_c76589f2-c316-4631-8700-6c8d6af4f466","basePath":"/tmp/sparktestJgovL2"}: {}
java.io.FileNotFoundException: /tmp/sparktestJgovL2/job_c76589f2-c316-4631-8700-6c8d6af4f466/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575605471.94_fed4c879-3b71-4d17-80cb-f0a593187bb8 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139846131635968)>

# Thread: <Thread(Thread-118, started daemon 139846140028672)>

# Thread: <_MainThread(MainThread, started 139847121999616)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139846106457856)>

# Thread: <Thread(Thread-124, started daemon 139846114850560)>

# Thread: <Thread(Thread-118, started daemon 139846140028672)>

# Thread: <_MainThread(MainThread, started 139847121999616)>

# Thread: <Thread(wait_until_finish_read, started daemon 139846131635968)>

----------------------------------------------------------------------
Ran 38 tests in 343.016s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 28s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/jnq5e2qoft5uc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1710

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1710/display/redirect?page=changes>

Changes:

[pabloem] Reactivating test while preventing timeouts.


------------------------------------------
[...truncated 1.32 MB...]
19/12/06 01:32:43 INFO sdk_worker_main.start: Status HTTP server running at localhost:44509
19/12/06 01:32:43 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 01:32:43 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 01:32:43 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575595960.71_fe0bf38e-ebf1-48c2-bc54-a93860f22f0c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 01:32:43 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575595960.71', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:55489', 'job_port': u'0'}
19/12/06 01:32:43 INFO statecache.__init__: Creating state cache with size 0
19/12/06 01:32:43 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41993.
19/12/06 01:32:43 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/06 01:32:43 INFO sdk_worker.__init__: Control channel established.
19/12/06 01:32:43 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 01:32:43 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33509.
19/12/06 01:32:43 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 01:32:43 INFO data_plane.create_data_channel: Creating client data channel for localhost:44547
19/12/06 01:32:43 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 01:32:43 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 01:32:43 INFO sdk_worker.run: No more requests from control plane
19/12/06 01:32:43 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 01:32:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 01:32:43 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 01:32:43 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 01:32:43 INFO sdk_worker.run: Done consuming work.
19/12/06 01:32:43 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 01:32:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 01:32:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 01:32:43 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 01:32:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 01:32:44 INFO sdk_worker_main.main: Logging handler created.
19/12/06 01:32:44 INFO sdk_worker_main.start: Status HTTP server running at localhost:43941
19/12/06 01:32:44 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 01:32:44 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 01:32:44 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575595960.71_fe0bf38e-ebf1-48c2-bc54-a93860f22f0c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 01:32:44 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575595960.71', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:55489', 'job_port': u'0'}
19/12/06 01:32:44 INFO statecache.__init__: Creating state cache with size 0
19/12/06 01:32:44 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39307.
19/12/06 01:32:44 INFO sdk_worker.__init__: Control channel established.
19/12/06 01:32:44 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 01:32:44 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/06 01:32:44 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34719.
19/12/06 01:32:44 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 01:32:44 INFO data_plane.create_data_channel: Creating client data channel for localhost:35213
19/12/06 01:32:44 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 01:32:44 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 01:32:44 INFO sdk_worker.run: No more requests from control plane
19/12/06 01:32:44 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 01:32:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 01:32:44 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 01:32:44 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 01:32:44 INFO sdk_worker.run: Done consuming work.
19/12/06 01:32:44 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 01:32:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 01:32:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 01:32:44 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 01:32:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 01:32:45 INFO sdk_worker_main.main: Logging handler created.
19/12/06 01:32:45 INFO sdk_worker_main.start: Status HTTP server running at localhost:38001
19/12/06 01:32:45 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 01:32:45 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 01:32:45 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575595960.71_fe0bf38e-ebf1-48c2-bc54-a93860f22f0c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 01:32:45 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575595960.71', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:55489', 'job_port': u'0'}
19/12/06 01:32:45 INFO statecache.__init__: Creating state cache with size 0
19/12/06 01:32:45 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39101.
19/12/06 01:32:45 INFO sdk_worker.__init__: Control channel established.
19/12/06 01:32:45 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/06 01:32:45 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 01:32:45 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38075.
19/12/06 01:32:45 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 01:32:45 INFO data_plane.create_data_channel: Creating client data channel for localhost:35437
19/12/06 01:32:45 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 01:32:45 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 01:32:45 INFO sdk_worker.run: No more requests from control plane
19/12/06 01:32:45 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 01:32:45 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 01:32:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 01:32:45 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 01:32:45 INFO sdk_worker.run: Done consuming work.
19/12/06 01:32:45 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 01:32:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 01:32:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 01:32:45 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 01:32:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 01:32:46 INFO sdk_worker_main.main: Logging handler created.
19/12/06 01:32:46 INFO sdk_worker_main.start: Status HTTP server running at localhost:41533
19/12/06 01:32:46 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 01:32:46 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 01:32:46 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575595960.71_fe0bf38e-ebf1-48c2-bc54-a93860f22f0c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 01:32:46 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575595960.71', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:55489', 'job_port': u'0'}
19/12/06 01:32:46 INFO statecache.__init__: Creating state cache with size 0
19/12/06 01:32:46 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42749.
19/12/06 01:32:46 INFO sdk_worker.__init__: Control channel established.
19/12/06 01:32:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/06 01:32:46 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 01:32:46 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42937.
19/12/06 01:32:46 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 01:32:46 INFO data_plane.create_data_channel: Creating client data channel for localhost:35241
19/12/06 01:32:46 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 01:32:46 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 01:32:46 INFO sdk_worker.run: No more requests from control plane
19/12/06 01:32:46 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 01:32:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 01:32:46 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 01:32:46 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 01:32:46 INFO sdk_worker.run: Done consuming work.
19/12/06 01:32:46 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 01:32:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 01:32:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 01:32:46 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575595960.71_fe0bf38e-ebf1-48c2-bc54-a93860f22f0c finished.
19/12/06 01:32:46 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/06 01:32:46 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_2aeba5fd-7e7a-4733-ab9f-79a4263c2f12","basePath":"/tmp/sparktest6aBpvu"}: {}
java.io.FileNotFoundException: /tmp/sparktest6aBpvu/job_2aeba5fd-7e7a-4733-ab9f-79a4263c2f12/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139858043000576)>

# Thread: <Thread(Thread-119, started daemon 139857682753280)>

# Thread: <_MainThread(MainThread, started 139858822739712)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139857674360576)>

# Thread: <Thread(Thread-125, started daemon 139857665967872)>

# Thread: <_MainThread(MainThread, started 139858822739712)>

# Thread: <Thread(Thread-119, started daemon 139857682753280)>

# Thread: <Thread(wait_until_finish_read, started daemon 139858043000576)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575595950.1_10c6bc26-7849-46c4-98f7-97b60862db9c failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
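
For context on the RuntimeError above: wait_until_finish in the portable
runner consumes the job service's gRPC state stream and raises once the
pipeline reaches a terminal state other than DONE. A condensed sketch of
that control flow (a simplified paraphrase of
apache_beam/runners/portability/portable_runner.py, not the verbatim
source; the argument names are illustrative):

    from apache_beam.runners.runner import PipelineState

    def wait_until_finish(job_id, state_stream, last_error_message):
        # Block on the state stream until a terminal state arrives.
        state = PipelineState.UNKNOWN
        for state_response in state_stream:
            state = state_response.state  # assumed comparable to PipelineState values
            if PipelineState.is_terminal(state):
                break
        if state != PipelineState.DONE:
            # Surface the runner-side failure, e.g. the
            # UnsupportedOperationException reported above.
            raise RuntimeError('Pipeline %s failed in state %s: %s'
                               % (job_id, state, last_error_message()))
        return state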

----------------------------------------------------------------------
Ran 38 tests in 340.678s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 24s
60 actionable tasks: 48 executed, 12 from cache

Publishing build scan...
https://scans.gradle.com/s/fujyacowm4com

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1709

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1709/display/redirect?page=changes>

Changes:

[rohde.samuel] fix assert equals_to_per_window to actually assert window's existence

[robertwb] Fix [BEAM-8581] and [BEAM-8582]


------------------------------------------
[...truncated 1.32 MB...]
19/12/06 00:21:34 INFO sdk_worker_main.start: Status HTTP server running at localhost:38575
19/12/06 00:21:34 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 00:21:34 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 00:21:34 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575591691.65_2c0c943b-ff73-4b92-a6b2-9736fe06329b', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 00:21:34 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575591691.65', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49761', 'job_port': u'0'}
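
The pipeline_options dict echoed above is the parsed form of flags such as
the following. A hedged example of pointing a Beam Python pipeline at the
same kind of local Spark job server (the endpoint and worker-script path
are placeholders, not values to copy from this log):

    from apache_beam import Pipeline
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:49761',   # placeholder job service address
        '--environment_type=PROCESS',
        '--environment_config={"command": "/path/to/sdk_worker.sh"}',
        '--experiments=beam_fn_api',
        '--sdk_worker_parallelism=1',
    ])
    pipeline = Pipeline(options=options)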
19/12/06 00:21:34 INFO statecache.__init__: Creating state cache with size 0
19/12/06 00:21:34 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37103.
19/12/06 00:21:34 INFO sdk_worker.__init__: Control channel established.
19/12/06 00:21:34 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 00:21:34 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/06 00:21:34 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38737.
19/12/06 00:21:34 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 00:21:34 INFO data_plane.create_data_channel: Creating client data channel for localhost:37769
19/12/06 00:21:34 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 00:21:34 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 00:21:34 INFO sdk_worker.run: No more requests from control plane
19/12/06 00:21:34 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 00:21:34 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 00:21:34 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 00:21:34 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 00:21:34 INFO sdk_worker.run: Done consuming work.
19/12/06 00:21:34 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 00:21:34 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 00:21:34 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 00:21:34 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 00:21:34 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 00:21:34 INFO sdk_worker_main.main: Logging handler created.
19/12/06 00:21:34 INFO sdk_worker_main.start: Status HTTP server running at localhost:35327
19/12/06 00:21:34 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 00:21:34 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 00:21:34 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575591691.65_2c0c943b-ff73-4b92-a6b2-9736fe06329b', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 00:21:34 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575591691.65', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49761', 'job_port': u'0'}
19/12/06 00:21:34 INFO statecache.__init__: Creating state cache with size 0
19/12/06 00:21:34 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43735.
19/12/06 00:21:35 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/06 00:21:35 INFO sdk_worker.__init__: Control channel established.
19/12/06 00:21:35 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 00:21:35 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41325.
19/12/06 00:21:35 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 00:21:35 INFO data_plane.create_data_channel: Creating client data channel for localhost:35129
19/12/06 00:21:35 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 00:21:35 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 00:21:35 INFO sdk_worker.run: No more requests from control plane
19/12/06 00:21:35 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 00:21:35 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 00:21:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 00:21:35 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 00:21:35 INFO sdk_worker.run: Done consuming work.
19/12/06 00:21:35 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 00:21:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 00:21:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 00:21:35 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 00:21:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 00:21:35 INFO sdk_worker_main.main: Logging handler created.
19/12/06 00:21:35 INFO sdk_worker_main.start: Status HTTP server running at localhost:40045
19/12/06 00:21:35 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 00:21:35 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 00:21:35 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575591691.65_2c0c943b-ff73-4b92-a6b2-9736fe06329b', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 00:21:35 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575591691.65', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49761', 'job_port': u'0'}
19/12/06 00:21:35 INFO statecache.__init__: Creating state cache with size 0
19/12/06 00:21:35 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39807.
19/12/06 00:21:35 INFO sdk_worker.__init__: Control channel established.
19/12/06 00:21:35 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 00:21:35 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/06 00:21:35 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34131.
19/12/06 00:21:35 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 00:21:35 INFO data_plane.create_data_channel: Creating client data channel for localhost:35547
19/12/06 00:21:35 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 00:21:35 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 00:21:35 INFO sdk_worker.run: No more requests from control plane
19/12/06 00:21:35 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 00:21:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 00:21:35 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 00:21:35 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 00:21:35 INFO sdk_worker.run: Done consuming work.
19/12/06 00:21:35 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 00:21:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 00:21:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 00:21:36 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 00:21:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 00:21:36 INFO sdk_worker_main.main: Logging handler created.
19/12/06 00:21:36 INFO sdk_worker_main.start: Status HTTP server running at localhost:45025
19/12/06 00:21:36 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 00:21:36 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 00:21:36 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575591691.65_2c0c943b-ff73-4b92-a6b2-9736fe06329b', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 00:21:36 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575591691.65', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49761', 'job_port': u'0'}
19/12/06 00:21:36 INFO statecache.__init__: Creating state cache with size 0
19/12/06 00:21:36 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38377.
19/12/06 00:21:36 INFO sdk_worker.__init__: Control channel established.
19/12/06 00:21:36 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 00:21:36 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/06 00:21:36 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38101.
19/12/06 00:21:36 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 00:21:36 INFO data_plane.create_data_channel: Creating client data channel for localhost:42459
19/12/06 00:21:36 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 00:21:36 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 00:21:36 INFO sdk_worker.run: No more requests from control plane
19/12/06 00:21:36 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 00:21:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 00:21:36 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 00:21:36 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 00:21:36 INFO sdk_worker.run: Done consuming work.
19/12/06 00:21:36 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 00:21:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 00:21:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 00:21:36 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575591691.65_2c0c943b-ff73-4b92-a6b2-9736fe06329b finished.
19/12/06 00:21:36 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/06 00:21:36 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_e2f2f881-2994-4000-a47d-3ab4cb74be84","basePath":"/tmp/sparktestcztjUX"}: {}
java.io.FileNotFoundException: /tmp/sparktestcztjUX/job_e2f2f881-2994-4000-a47d-3ab4cb74be84/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140450396632832)>
# Thread: <Thread(Thread-119, started daemon 140450388240128)>
# Thread: <_MainThread(MainThread, started 140451525756672)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140450379847424)>
# Thread: <Thread(Thread-125, started daemon 140450371454720)>
# Thread: <_MainThread(MainThread, started 140451525756672)>
# Thread: <Thread(Thread-119, started daemon 140450388240128)>
# Thread: <Thread(wait_until_finish_read, started daemon 140450396632832)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575591682.57_d1d435f9-6a11-4a28-af77-d1d923fda54a failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 308.378s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 44s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/qp2lkpd26kzc4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1708

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1708/display/redirect?page=changes>

Changes:

[lcwik] [BEAM-4287] Fix to use the residual instead of the current restriction


------------------------------------------
[...truncated 1.31 MB...]
19/12/05 23:38:22 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1575589101.32_48e29f14-cd99-473e-aa14-5eeb390e3b04 on Spark master local
19/12/05 23:38:22 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
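
The warning above concerns a property the Spark runner leans on when
grouping by serialized keys: a coder is "consistent with equals" when
values that compare equal always encode to identical bytes, making
byte-wise comparison of encoded keys safe. A small illustration of the
property using the Beam Python SDK's coders (the warning itself is about
the Java LengthPrefixCoder(ByteArrayCoder) and GlobalWindow$Coder):

    from apache_beam.coders import VarIntCoder

    coder = VarIntCoder()
    a, b = 42, 42

    # Equal values encode to identical bytes, so a runner may group or
    # compare keys purely by their encoded form.
    assert a == b
    assert coder.encode(a) == coder.encode(b)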
19/12/05 23:38:22 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575589101.32_48e29f14-cd99-473e-aa14-5eeb390e3b04: Pipeline translated successfully. Computing outputs
19/12/05 23:38:22 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 23:38:23 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 23:38:23 INFO sdk_worker_main.main: Logging handler created.
19/12/05 23:38:23 INFO sdk_worker_main.start: Status HTTP server running at localhost:43667
19/12/05 23:38:23 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 23:38:23 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 23:38:23 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575589101.32_48e29f14-cd99-473e-aa14-5eeb390e3b04', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 23:38:23 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575589101.32', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42559', 'job_port': u'0'}
19/12/05 23:38:23 INFO statecache.__init__: Creating state cache with size 0
19/12/05 23:38:23 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35411.
19/12/05 23:38:23 INFO sdk_worker.__init__: Control channel established.
19/12/05 23:38:23 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 23:38:23 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/05 23:38:23 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37577.
19/12/05 23:38:23 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 23:38:23 INFO data_plane.create_data_channel: Creating client data channel for localhost:45313
19/12/05 23:38:23 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 23:38:23 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 23:38:23 INFO sdk_worker.run: No more requests from control plane
19/12/05 23:38:23 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 23:38:23 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 23:38:23 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:38:23 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 23:38:23 INFO sdk_worker.run: Done consuming work.
19/12/05 23:38:23 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 23:38:23 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 23:38:23 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:38:23 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 23:38:24 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 23:38:24 INFO sdk_worker_main.main: Logging handler created.
19/12/05 23:38:24 INFO sdk_worker_main.start: Status HTTP server running at localhost:33523
19/12/05 23:38:24 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 23:38:24 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 23:38:24 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575589101.32_48e29f14-cd99-473e-aa14-5eeb390e3b04', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 23:38:24 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575589101.32', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42559', 'job_port': u'0'}
19/12/05 23:38:24 INFO statecache.__init__: Creating state cache with size 0
19/12/05 23:38:24 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41079.
19/12/05 23:38:24 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/05 23:38:24 INFO sdk_worker.__init__: Control channel established.
19/12/05 23:38:24 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 23:38:24 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34019.
19/12/05 23:38:24 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 23:38:24 INFO data_plane.create_data_channel: Creating client data channel for localhost:39405
19/12/05 23:38:24 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 23:38:24 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 23:38:24 INFO sdk_worker.run: No more requests from control plane
19/12/05 23:38:24 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 23:38:24 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:38:24 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 23:38:24 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 23:38:24 INFO sdk_worker.run: Done consuming work.
19/12/05 23:38:24 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 23:38:24 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 23:38:24 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:38:24 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 23:38:24 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 23:38:24 INFO sdk_worker_main.main: Logging handler created.
19/12/05 23:38:24 INFO sdk_worker_main.start: Status HTTP server running at localhost:34771
19/12/05 23:38:24 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 23:38:24 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 23:38:24 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575589101.32_48e29f14-cd99-473e-aa14-5eeb390e3b04', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 23:38:24 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575589101.32', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42559', 'job_port': u'0'}
19/12/05 23:38:24 INFO statecache.__init__: Creating state cache with size 0
19/12/05 23:38:24 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45277.
19/12/05 23:38:24 INFO sdk_worker.__init__: Control channel established.
19/12/05 23:38:24 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/05 23:38:24 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 23:38:24 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35493.
19/12/05 23:38:24 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 23:38:24 INFO data_plane.create_data_channel: Creating client data channel for localhost:42569
19/12/05 23:38:24 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 23:38:25 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 23:38:25 INFO sdk_worker.run: No more requests from control plane
19/12/05 23:38:25 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 23:38:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:38:25 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 23:38:25 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 23:38:25 INFO sdk_worker.run: Done consuming work.
19/12/05 23:38:25 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 23:38:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 23:38:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:38:25 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 23:38:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 23:38:25 INFO sdk_worker_main.main: Logging handler created.
19/12/05 23:38:25 INFO sdk_worker_main.start: Status HTTP server running at localhost:43009
19/12/05 23:38:25 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 23:38:25 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 23:38:25 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575589101.32_48e29f14-cd99-473e-aa14-5eeb390e3b04', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 23:38:25 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575589101.32', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42559', 'job_port': u'0'}
19/12/05 23:38:25 INFO statecache.__init__: Creating state cache with size 0
19/12/05 23:38:25 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45219.
19/12/05 23:38:25 INFO sdk_worker.__init__: Control channel established.
19/12/05 23:38:25 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 23:38:25 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/05 23:38:25 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37565.
19/12/05 23:38:25 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 23:38:25 INFO data_plane.create_data_channel: Creating client data channel for localhost:33097
19/12/05 23:38:25 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 23:38:25 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 23:38:25 INFO sdk_worker.run: No more requests from control plane
19/12/05 23:38:25 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 23:38:25 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 23:38:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:38:25 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 23:38:25 INFO sdk_worker.run: Done consuming work.
19/12/05 23:38:25 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 23:38:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 23:38:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:38:25 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 23:38:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 23:38:26 INFO sdk_worker_main.main: Logging handler created.
19/12/05 23:38:26 INFO sdk_worker_main.start: Status HTTP server running at localhost:46749
19/12/05 23:38:26 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 23:38:26 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 23:38:26 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575589101.32_48e29f14-cd99-473e-aa14-5eeb390e3b04', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 23:38:26 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575589101.32', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42559', 'job_port': u'0'}
19/12/05 23:38:26 INFO statecache.__init__: Creating state cache with size 0
19/12/05 23:38:26 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34125.
19/12/05 23:38:26 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/05 23:38:26 INFO sdk_worker.__init__: Control channel established.
19/12/05 23:38:26 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 23:38:26 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42031.
19/12/05 23:38:26 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 23:38:26 INFO data_plane.create_data_channel: Creating client data channel for localhost:39039
19/12/05 23:38:26 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 23:38:26 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 23:38:26 INFO sdk_worker.run: No more requests from control plane
19/12/05 23:38:26 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 23:38:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:38:26 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 23:38:26 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 23:38:26 INFO sdk_worker.run: Done consuming work.
19/12/05 23:38:26 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 23:38:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 23:38:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:38:26 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575589101.32_48e29f14-cd99-473e-aa14-5eeb390e3b04 finished.
19/12/05 23:38:26 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/05 23:38:26 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_6a461650-c265-44c1-a82e-0e9cbfb14224","basePath":"/tmp/sparktestB38E19"}: {}
java.io.FileNotFoundException: /tmp/sparktestB38E19/job_6a461650-c265-44c1-a82e-0e9cbfb14224/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
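
The WARN and FileNotFoundException above show the job service's staging
cleanup failing because the MANIFEST was already gone, while the job
itself still reached DONE; the failure is cosmetic. A hedged Python sketch
of the tolerant-cleanup pattern (illustrative only; the actual code path
is the Java BeamFileSystemArtifactStagingService.removeArtifacts in the
stack trace):

    import errno
    import shutil

    def remove_staging_dir(path):
        # Best-effort cleanup: an earlier or concurrent teardown may have
        # removed the staging directory already, which is acceptable.
        try:
            shutil.rmtree(path)
        except OSError as e:
            if e.errno != errno.ENOENT:  # tolerate only "No such file or directory"
                raise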
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139977121392384)>
# Thread: <Thread(Thread-119, started daemon 139977129785088)>
# Thread: <_MainThread(MainThread, started 139977909524224)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575589092.24_0275e6c0-7aa0-464e-a4b1-c7ab99cff185 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 289.074s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 3s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/eefmoylgfd5cc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1707

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1707/display/redirect?page=changes>

Changes:

[chadrik] Make local job service accessible from external machines

[chadrik] Provide methods to override bind and service addresses independently

[chadrik] Fix lint


------------------------------------------
[...truncated 1.32 MB...]
19/12/05 23:24:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:24:39 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 23:24:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 23:24:40 INFO sdk_worker_main.main: Logging handler created.
19/12/05 23:24:40 INFO sdk_worker_main.start: Status HTTP server running at localhost:33803
19/12/05 23:24:40 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 23:24:40 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 23:24:40 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575588277.65_ebdb9e65-542f-4335-8e09-cdc0f23e9373', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 23:24:40 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575588277.65', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36969', 'job_port': u'0'}
19/12/05 23:24:40 INFO statecache.__init__: Creating state cache with size 0
19/12/05 23:24:40 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36609.
19/12/05 23:24:40 INFO sdk_worker.__init__: Control channel established.
19/12/05 23:24:40 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 23:24:40 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/05 23:24:40 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33897.
19/12/05 23:24:40 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 23:24:40 INFO data_plane.create_data_channel: Creating client data channel for localhost:40791
19/12/05 23:24:40 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 23:24:40 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 23:24:40 INFO sdk_worker.run: No more requests from control plane
19/12/05 23:24:40 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 23:24:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:24:40 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 23:24:40 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 23:24:40 INFO sdk_worker.run: Done consuming work.
19/12/05 23:24:40 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 23:24:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 23:24:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:24:40 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
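
For context on the "Discarding unparseable args" warnings repeated above: Beam's Python options parsing keeps the flags it recognizes and warns about the rest rather than aborting the worker. A minimal sketch of that behavior using argparse's parse_known_args; the flags registered here are just examples, not Beam's full option set:

    import argparse
    import logging

    # Register only the flags we understand; everything else is "unparseable".
    parser = argparse.ArgumentParser()
    parser.add_argument('--job_name')
    parser.add_argument('--sdk_worker_parallelism')

    known, unknown = parser.parse_known_args([
        '--job_name=test_windowing',
        '--app_name=ignored',      # not registered, ends up in `unknown`
        '--options_id=30',         # likewise
    ])
    if unknown:
        logging.warning('Discarding unparseable args: %s', unknown)
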
19/12/05 23:24:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 23:24:41 INFO sdk_worker_main.main: Logging handler created.
19/12/05 23:24:41 INFO sdk_worker_main.start: Status HTTP server running at localhost:34059
19/12/05 23:24:41 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 23:24:41 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 23:24:41 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575588277.65_ebdb9e65-542f-4335-8e09-cdc0f23e9373', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 23:24:41 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575588277.65', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36969', 'job_port': u'0'}
19/12/05 23:24:41 INFO statecache.__init__: Creating state cache with size 0
19/12/05 23:24:41 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44965.
19/12/05 23:24:41 INFO sdk_worker.__init__: Control channel established.
19/12/05 23:24:41 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 23:24:41 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/05 23:24:41 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34333.
19/12/05 23:24:41 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 23:24:41 INFO data_plane.create_data_channel: Creating client data channel for localhost:46269
19/12/05 23:24:41 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 23:24:41 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 23:24:41 INFO sdk_worker.run: No more requests from control plane
19/12/05 23:24:41 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 23:24:41 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 23:24:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:24:41 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 23:24:41 INFO sdk_worker.run: Done consuming work.
19/12/05 23:24:41 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 23:24:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 23:24:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:24:41 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
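
The pipeline_options dict each worker logs shows how these tests launch the SDK harness: environment_type PROCESS with an environment_config naming the sdk_worker.sh launcher. A hedged reconstruction of the equivalent client-side options (the sdk_worker.sh path is whatever the test suite built; substitute your own):

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:36969',   # job server address from the log
        '--environment_type=PROCESS',       # spawn the harness as a process
        '--environment_config={"command": "/path/to/sdk_worker.sh"}',
        '--sdk_worker_parallelism=1',
    ])
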
19/12/05 23:24:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 23:24:42 INFO sdk_worker_main.main: Logging handler created.
19/12/05 23:24:42 INFO sdk_worker_main.start: Status HTTP server running at localhost:39059
19/12/05 23:24:42 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 23:24:42 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 23:24:42 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575588277.65_ebdb9e65-542f-4335-8e09-cdc0f23e9373', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 23:24:42 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575588277.65', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36969', 'job_port': u'0'}
19/12/05 23:24:42 INFO statecache.__init__: Creating state cache with size 0
19/12/05 23:24:42 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33427.
19/12/05 23:24:42 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/05 23:24:42 INFO sdk_worker.__init__: Control channel established.
19/12/05 23:24:42 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 23:24:42 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37655.
19/12/05 23:24:42 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 23:24:42 INFO data_plane.create_data_channel: Creating client data channel for localhost:42287
19/12/05 23:24:42 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 23:24:42 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 23:24:42 INFO sdk_worker.run: No more requests from control plane
19/12/05 23:24:42 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 23:24:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:24:42 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 23:24:42 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 23:24:42 INFO sdk_worker.run: Done consuming work.
19/12/05 23:24:42 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 23:24:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 23:24:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:24:42 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 23:24:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 23:24:43 INFO sdk_worker_main.main: Logging handler created.
19/12/05 23:24:43 INFO sdk_worker_main.start: Status HTTP server running at localhost:40741
19/12/05 23:24:43 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 23:24:43 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 23:24:43 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575588277.65_ebdb9e65-542f-4335-8e09-cdc0f23e9373', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 23:24:43 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575588277.65', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36969', 'job_port': u'0'}
19/12/05 23:24:43 INFO statecache.__init__: Creating state cache with size 0
19/12/05 23:24:43 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44297.
19/12/05 23:24:43 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/05 23:24:43 INFO sdk_worker.__init__: Control channel established.
19/12/05 23:24:43 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 23:24:43 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42911.
19/12/05 23:24:43 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 23:24:43 INFO data_plane.create_data_channel: Creating client data channel for localhost:33745
19/12/05 23:24:43 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 23:24:43 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 23:24:43 INFO sdk_worker.run: No more requests from control plane
19/12/05 23:24:43 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 23:24:43 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 23:24:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:24:43 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 23:24:43 INFO sdk_worker.run: Done consuming work.
19/12/05 23:24:43 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 23:24:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 23:24:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:24:43 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575588277.65_ebdb9e65-542f-4335-8e09-cdc0f23e9373 finished.
19/12/05 23:24:43 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/05 23:24:43 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_04fc138d-2f80-4963-84f5-83cd44e1efbb","basePath":"/tmp/sparktest0N3sLO"}: {}
java.io.FileNotFoundException: /tmp/sparktest0N3sLO/job_04fc138d-2f80-4963-84f5-83cd44e1efbb/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
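
Note on the staging-cleanup warning above: this job staged no artifacts (the log shows "GetManifest for __no_artifacts_staged__"), so the MANIFEST file was never written and InMemoryJobService's cleanup trips over the missing file. A defensive sketch of the idea with a hypothetical helper name; Beam's actual cleanup lives in BeamFileSystemArtifactStagingService.removeArtifacts:

    import os
    import shutil

    def remove_job_staging(staging_dir):
        manifest = os.path.join(staging_dir, 'MANIFEST')
        if not os.path.isfile(manifest):
            return  # nothing was staged for this job; treat as already clean
        # Otherwise delete the staged artifacts along with the manifest.
        shutil.rmtree(staging_dir, ignore_errors=True)
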
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
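
This failure mode comes from the test suite's own watchdog (the handler at line 75 of portable_runner_test.py): after 60 seconds it prints a banner plus every live thread, then raises BaseException so that ordinary "except Exception" blocks cannot swallow the timeout. A sketch of such a watchdog, assuming a SIGALRM-based implementation and a hypothetical install_timeout name:

    import signal
    import threading

    def install_timeout(seconds=60):
        def handler(signum, frame):
            msg = 'Timed out after %d seconds.' % seconds
            # The banner and "# Thread:" lines seen in the output above.
            print('=' * 20 + ' ' + msg + ' ' + '=' * 20)
            for thread in threading.enumerate():
                print('# Thread: %s' % thread)
            raise BaseException(msg)
        signal.signal(signal.SIGALRM, handler)
        signal.alarm(seconds)
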

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140385489295104)>

# Thread: <Thread(Thread-120, started daemon 140385472509696)>

# Thread: <_MainThread(MainThread, started 140386269034240)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575588268.01_c4f0e71b-49d4-424a-8f44-e1a6b2af3a63 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140385463068416)>

# Thread: <Thread(Thread-126, started daemon 140385454675712)>

# Thread: <_MainThread(MainThread, started 140386269034240)>

----------------------------------------------------------------------
Ran 38 tests in 300.667s

FAILED (errors=3, skipped=9)
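
On the test_sdf_with_watermark_tracking error: a splittable DoFn checkpoints by splitting its remaining work into a primary part (finished by the current bundle) and a residual (handed back to the runner to resume later). The UnsupportedOperationException says the portable Spark runner registered no handler to accept that residual, so any SDF that defers work fails. An illustrative split using Beam's offset-based tracker, not the runner-side fix itself:

    from apache_beam.io.restriction_trackers import (
        OffsetRange, OffsetRestrictionTracker)

    tracker = OffsetRestrictionTracker(OffsetRange(0, 100))
    tracker.try_claim(0)  # claim some work first
    # Checkpoint: keep roughly half of the remainder, return the rest.
    primary, residual = tracker.try_split(0.5)
    print(primary, residual)
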

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 4s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/myewgw62ojru6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1706

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1706/display/redirect?page=changes>

Changes:

[kirillkozlov] MongoDb project push-down, needs tests

[kirillkozlov] Add tests for MongoDb project push-down

[kirillkozlov] Added cleanup for tests

[kirillkozlov] rebase

[kirillkozlov] Check last executed query


------------------------------------------
[...truncated 1.32 MB...]
19/12/05 23:02:44 INFO sdk_worker_main.start: Status HTTP server running at localhost:45937
19/12/05 23:02:44 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 23:02:44 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 23:02:44 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575586962.28_74ff277e-b467-4278-aadf-89acb49c4c7f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 23:02:44 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575586962.28', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44773', 'job_port': u'0'}
19/12/05 23:02:44 INFO statecache.__init__: Creating state cache with size 0
19/12/05 23:02:44 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33335.
19/12/05 23:02:44 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/05 23:02:44 INFO sdk_worker.__init__: Control channel established.
19/12/05 23:02:44 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 23:02:44 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39833.
19/12/05 23:02:44 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 23:02:44 INFO data_plane.create_data_channel: Creating client data channel for localhost:38181
19/12/05 23:02:44 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 23:02:44 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 23:02:44 INFO sdk_worker.run: No more requests from control plane
19/12/05 23:02:44 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 23:02:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:02:44 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 23:02:44 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 23:02:44 INFO sdk_worker.run: Done consuming work.
19/12/05 23:02:44 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 23:02:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 23:02:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:02:45 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
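
About the recurring "Creating state cache with size 0" line: a zero-capacity cache stores nothing, so every user-state read goes back to the runner over the state channel. Beam's real statecache is an LRU keyed by state tokens; this stand-in only shows the size-0 behavior:

    class ToyStateCache(object):
        def __init__(self, max_entries):
            self._max = max_entries
            self._data = {}

        def get(self, key):
            return self._data.get(key)

        def put(self, key, value):
            if self._max <= 0:
                return  # size 0: caching disabled, every get is a miss
            if len(self._data) >= self._max:
                self._data.pop(next(iter(self._data)))  # crude eviction
            self._data[key] = value
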
19/12/05 23:02:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 23:02:45 INFO sdk_worker_main.main: Logging handler created.
19/12/05 23:02:45 INFO sdk_worker_main.start: Status HTTP server running at localhost:35673
19/12/05 23:02:45 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 23:02:45 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 23:02:45 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575586962.28_74ff277e-b467-4278-aadf-89acb49c4c7f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 23:02:45 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575586962.28', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44773', 'job_port': u'0'}
19/12/05 23:02:45 INFO statecache.__init__: Creating state cache with size 0
19/12/05 23:02:45 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43425.
19/12/05 23:02:45 INFO sdk_worker.__init__: Control channel established.
19/12/05 23:02:45 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/05 23:02:45 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 23:02:45 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38385.
19/12/05 23:02:45 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 23:02:45 INFO data_plane.create_data_channel: Creating client data channel for localhost:39393
19/12/05 23:02:45 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 23:02:45 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 23:02:45 INFO sdk_worker.run: No more requests from control plane
19/12/05 23:02:45 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 23:02:45 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 23:02:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:02:45 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 23:02:45 INFO sdk_worker.run: Done consuming work.
19/12/05 23:02:45 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 23:02:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 23:02:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:02:45 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 23:02:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 23:02:46 INFO sdk_worker_main.main: Logging handler created.
19/12/05 23:02:46 INFO sdk_worker_main.start: Status HTTP server running at localhost:37585
19/12/05 23:02:46 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 23:02:46 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 23:02:46 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575586962.28_74ff277e-b467-4278-aadf-89acb49c4c7f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 23:02:46 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575586962.28', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44773', 'job_port': u'0'}
19/12/05 23:02:46 INFO statecache.__init__: Creating state cache with size 0
19/12/05 23:02:46 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45907.
19/12/05 23:02:46 INFO sdk_worker.__init__: Control channel established.
19/12/05 23:02:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/05 23:02:46 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 23:02:46 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40309.
19/12/05 23:02:46 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 23:02:46 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 23:02:46 INFO data_plane.create_data_channel: Creating client data channel for localhost:38915
19/12/05 23:02:46 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 23:02:46 INFO sdk_worker.run: No more requests from control plane
19/12/05 23:02:46 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 23:02:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:02:46 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 23:02:46 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 23:02:46 INFO sdk_worker.run: Done consuming work.
19/12/05 23:02:46 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 23:02:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 23:02:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:02:46 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 23:02:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 23:02:47 INFO sdk_worker_main.main: Logging handler created.
19/12/05 23:02:47 INFO sdk_worker_main.start: Status HTTP server running at localhost:46559
19/12/05 23:02:47 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 23:02:47 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 23:02:47 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575586962.28_74ff277e-b467-4278-aadf-89acb49c4c7f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 23:02:47 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575586962.28', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44773', 'job_port': u'0'}
19/12/05 23:02:47 INFO statecache.__init__: Creating state cache with size 0
19/12/05 23:02:47 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39269.
19/12/05 23:02:47 INFO sdk_worker.__init__: Control channel established.
19/12/05 23:02:47 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 23:02:47 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/05 23:02:47 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34417.
19/12/05 23:02:47 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 23:02:47 INFO data_plane.create_data_channel: Creating client data channel for localhost:37239
19/12/05 23:02:47 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 23:02:47 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 23:02:47 INFO sdk_worker.run: No more requests from control plane
19/12/05 23:02:47 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 23:02:47 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 23:02:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:02:47 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 23:02:47 INFO sdk_worker.run: Done consuming work.
19/12/05 23:02:47 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 23:02:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 23:02:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:02:47 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575586962.28_74ff277e-b467-4278-aadf-89acb49c4c7f finished.
19/12/05 23:02:47 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/05 23:02:47 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_973ef860-5f13-450e-8fdb-b55275568b20","basePath":"/tmp/sparktestImSSir"}: {}
java.io.FileNotFoundException: /tmp/sparktestImSSir/job_973ef860-5f13-450e-8fdb-b55275568b20/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
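
Both timeouts in this run are hit inside wait_until_finish, which blocks on a gRPC stream of job-state messages and returns only when a terminal state arrives; if the Spark job server never publishes one, the iterator waits until the watchdog fires. A simplified sketch of that loop (state names illustrative):

    TERMINAL_STATES = {'DONE', 'FAILED', 'CANCELLED'}

    def wait_until_finish(state_stream):
        for state_response in state_stream:  # blocks in grpc _channel.next()
            if state_response.state in TERMINAL_STATES:
                return state_response.state
        raise RuntimeError('state stream ended without a terminal state')
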

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140425681368832)>

# Thread: <Thread(Thread-120, started daemon 140425664583424)>

# Thread: <_MainThread(MainThread, started 140426461107968)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575586953.04_293215c7-e50c-49fe-ae68-db900db84601 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140425655666432)>

# Thread: <Thread(Thread-126, started daemon 140425647273728)>

# Thread: <Thread(Thread-120, started daemon 140425664583424)>

# Thread: <Thread(wait_until_finish_read, started daemon 140425681368832)>

# Thread: <_MainThread(MainThread, started 140426461107968)>

----------------------------------------------------------------------
Ran 38 tests in 315.560s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 42s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/tbll5ozwksdv2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1705

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1705/display/redirect?page=changes>

Changes:

[github] Merge pull request #10278: [BEAM-7274] Support recursive type


------------------------------------------
[...truncated 1.32 MB...]
19/12/05 21:49:31 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 21:49:31 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 21:49:31 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575582568.24_18838da5-f0c0-4a56-84b1-61ef8c859b20', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 21:49:31 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575582568.24', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:60929', 'job_port': u'0'}
19/12/05 21:49:31 INFO statecache.__init__: Creating state cache with size 0
19/12/05 21:49:31 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41209.
19/12/05 21:49:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/05 21:49:31 INFO sdk_worker.__init__: Control channel established.
19/12/05 21:49:31 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 21:49:31 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41709.
19/12/05 21:49:31 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 21:49:31 INFO data_plane.create_data_channel: Creating client data channel for localhost:40809
19/12/05 21:49:31 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 21:49:31 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 21:49:31 INFO sdk_worker.run: No more requests from control plane
19/12/05 21:49:31 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 21:49:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 21:49:31 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 21:49:31 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 21:49:31 INFO sdk_worker.run: Done consuming work.
19/12/05 21:49:31 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 21:49:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 21:49:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 21:49:31 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 21:49:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 21:49:32 INFO sdk_worker_main.main: Logging handler created.
19/12/05 21:49:32 INFO sdk_worker_main.start: Status HTTP server running at localhost:37051
19/12/05 21:49:32 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 21:49:32 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 21:49:32 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575582568.24_18838da5-f0c0-4a56-84b1-61ef8c859b20', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 21:49:32 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575582568.24', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:60929', 'job_port': u'0'}
19/12/05 21:49:32 INFO statecache.__init__: Creating state cache with size 0
19/12/05 21:49:32 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38905.
19/12/05 21:49:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/05 21:49:32 INFO sdk_worker.__init__: Control channel established.
19/12/05 21:49:32 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 21:49:32 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34223.
19/12/05 21:49:32 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 21:49:32 INFO data_plane.create_data_channel: Creating client data channel for localhost:39591
19/12/05 21:49:32 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 21:49:32 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 21:49:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 21:49:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: 1 Beam Fn Logging clients still connected during shutdown.
19/12/05 21:49:32 INFO sdk_worker.run: No more requests from control plane
19/12/05 21:49:32 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 21:49:32 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 21:49:32 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 21:49:32 INFO sdk_worker.run: Done consuming work.
19/12/05 21:49:32 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 21:49:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 21:49:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 21:49:33 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 21:49:33 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 21:49:33 INFO sdk_worker_main.main: Logging handler created.
19/12/05 21:49:33 INFO sdk_worker_main.start: Status HTTP server running at localhost:33783
19/12/05 21:49:33 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 21:49:33 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 21:49:33 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575582568.24_18838da5-f0c0-4a56-84b1-61ef8c859b20', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 21:49:33 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575582568.24', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:60929', 'job_port': u'0'}
19/12/05 21:49:33 INFO statecache.__init__: Creating state cache with size 0
19/12/05 21:49:33 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39661.
19/12/05 21:49:33 INFO sdk_worker.__init__: Control channel established.
19/12/05 21:49:33 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 21:49:33 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/05 21:49:33 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38687.
19/12/05 21:49:33 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 21:49:33 INFO data_plane.create_data_channel: Creating client data channel for localhost:35303
19/12/05 21:49:33 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 21:49:33 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 21:49:33 INFO sdk_worker.run: No more requests from control plane
19/12/05 21:49:33 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 21:49:33 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 21:49:33 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 21:49:33 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 21:49:33 INFO sdk_worker.run: Done consuming work.
19/12/05 21:49:33 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 21:49:33 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 21:49:34 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 21:49:34 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 21:49:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 21:49:35 INFO sdk_worker_main.main: Logging handler created.
19/12/05 21:49:35 INFO sdk_worker_main.start: Status HTTP server running at localhost:44721
19/12/05 21:49:35 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 21:49:35 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 21:49:35 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575582568.24_18838da5-f0c0-4a56-84b1-61ef8c859b20', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 21:49:35 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575582568.24', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:60929', 'job_port': u'0'}
19/12/05 21:49:35 INFO statecache.__init__: Creating state cache with size 0
19/12/05 21:49:35 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36211.
19/12/05 21:49:35 INFO sdk_worker.__init__: Control channel established.
19/12/05 21:49:35 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/05 21:49:35 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 21:49:35 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35419.
19/12/05 21:49:35 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 21:49:35 INFO data_plane.create_data_channel: Creating client data channel for localhost:46257
19/12/05 21:49:35 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 21:49:35 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 21:49:35 INFO sdk_worker.run: No more requests from control plane
19/12/05 21:49:35 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 21:49:35 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 21:49:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 21:49:35 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 21:49:35 INFO sdk_worker.run: Done consuming work.
19/12/05 21:49:35 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 21:49:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 21:49:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 21:49:35 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575582568.24_18838da5-f0c0-4a56-84b1-61ef8c859b20 finished.
19/12/05 21:49:35 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/05 21:49:35 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_2cb1f1bc-ec9c-41f8-928e-4f84e5585b15","basePath":"/tmp/sparktestTiQJHU"}: {}
java.io.FileNotFoundException: /tmp/sparktestTiQJHU/job_2cb1f1bc-ec9c-41f8-928e-4f84e5585b15/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
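
The FileNotFoundException above is noisy but harmless: this job staged no artifacts (note the earlier "GetManifest for __no_artifacts_staged__" lines), so there is no MANIFEST for InMemoryJobService to delete, and the job still reaches DONE. A sketch of the tolerant-cleanup pattern that would silence the warning; this is illustrative Python, not the Java in BeamFileSystemArtifactStagingService:

    import errno
    import os
    import shutil

    def remove_staging_dir(base_path, session_id):
        """Best-effort cleanup: a missing MANIFEST means nothing was staged."""
        staging_dir = os.path.join(base_path, session_id)
        manifest = os.path.join(staging_dir, 'MANIFEST')
        try:
            open(manifest).close()  # stand-in for loadManifest()
        except (IOError, OSError) as e:
            if e.errno == errno.ENOENT:
                return  # no artifacts were staged; nothing to remove
            raise
        shutil.rmtree(staging_dir, ignore_errors=True)

    remove_staging_dir('/tmp/sparktestTiQJHU',
                       'job_2cb1f1bc-ec9c-41f8-928e-4f84e5585b15')
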
.==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140383749138176)>

# Thread: <Thread(Thread-116, started daemon 140383740745472)>

# Thread: <_MainThread(MainThread, started 140384878999296)>
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140383723960064)>

# Thread: <Thread(Thread-122, started daemon 140383732352768)>

# Thread: <Thread(Thread-116, started daemon 140383740745472)>

# Thread: <_MainThread(MainThread, started 140384878999296)>

# Thread: <Thread(wait_until_finish_read, started daemon 140383749138176)>

======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
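
All of the timeouts here share one shape: wait_until_finish blocks in "for state_response in self._state_stream", a server-streaming gRPC iterator with no deadline, so a job that never reaches a terminal state hangs the caller until the watchdog fires. A stripped-down illustration of the hazard, with a queue standing in for the state stream (plain Python, no gRPC):

    try:
        import queue
    except ImportError:       # Python 2
        import Queue as queue

    def state_stream(q):
        """Stands in for the gRPC state stream."""
        while True:
            state = q.get()   # blocks forever if no terminal state arrives
            yield state
            if state in ('DONE', 'FAILED', 'CANCELLED'):
                return

    q = queue.Queue()
    q.put('RUNNING')
    q.put('DONE')             # drop this line and the loop below hangs
    for state_response in state_stream(q):
        print('Job state changed to %s' % state_response)
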

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575582556.57_edf6f650-03ae-48b9-858f-23d1f7c74346 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
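
Unlike the two timeouts, test_sdf_with_watermark_tracking fails fast: the splittable DoFn asks the runner to checkpoint the in-flight bundle, i.e. to take back the unclaimed remainder of its restriction and reschedule it, and the portable Spark runner's ActiveBundle has no handler registered for that response. A rough sketch of what such a checkpoint produces, using a toy offset tracker (illustrative only, not Beam's RestrictionTracker API):

    class ToyOffsetTracker(object):
        """Tracks claimed offsets in [start, stop); checkpoint() splits off the rest."""
        def __init__(self, start, stop):
            self.claimed = start
            self.stop = stop

        def try_claim(self, offset):
            if offset >= self.stop:
                return False
            self.claimed = offset
            return True

        def checkpoint(self):
            # The runner must keep this residual and reschedule it later;
            # "no registered bundle checkpoint handler" means it cannot.
            residual = (self.claimed + 1, self.stop)
            self.stop = self.claimed + 1
            return residual

    tracker = ToyOffsetTracker(0, 10)
    for i in range(4):
        tracker.try_claim(i)  # claim offsets 0..3, then yield control
    print('residual restriction: %s' % (tracker.checkpoint(),))
    # -> (4, 10); the primary bundle now covers [0, 4)
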

----------------------------------------------------------------------
Ran 38 tests in 373.944s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 30s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/rrp7aygltzzve

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1704

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1704/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/12/05 18:32:42 INFO sdk_worker_main.start: Status HTTP server running at localhost:38127
19/12/05 18:32:42 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 18:32:42 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 18:32:42 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575570760.32_861b4ca6-88b5-4795-8169-271b5f8ae1ef', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 18:32:42 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575570760.32', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53783', 'job_port': u'0'}
19/12/05 18:32:42 INFO statecache.__init__: Creating state cache with size 0
19/12/05 18:32:42 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46433.
19/12/05 18:32:42 INFO sdk_worker.__init__: Control channel established.
19/12/05 18:32:42 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/05 18:32:42 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 18:32:42 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43869.
19/12/05 18:32:42 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 18:32:42 INFO data_plane.create_data_channel: Creating client data channel for localhost:42865
19/12/05 18:32:42 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 18:32:43 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 18:32:43 INFO sdk_worker.run: No more requests from control plane
19/12/05 18:32:43 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 18:32:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 18:32:43 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 18:32:43 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 18:32:43 INFO sdk_worker.run: Done consuming work.
19/12/05 18:32:43 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 18:32:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 18:32:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 18:32:43 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 18:32:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 18:32:44 INFO sdk_worker_main.main: Logging handler created.
19/12/05 18:32:44 INFO sdk_worker_main.start: Status HTTP server running at localhost:35857
19/12/05 18:32:44 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 18:32:44 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 18:32:44 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575570760.32_861b4ca6-88b5-4795-8169-271b5f8ae1ef', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 18:32:44 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575570760.32', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53783', 'job_port': u'0'}
19/12/05 18:32:44 INFO statecache.__init__: Creating state cache with size 0
19/12/05 18:32:44 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44565.
19/12/05 18:32:44 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/05 18:32:44 INFO sdk_worker.__init__: Control channel established.
19/12/05 18:32:44 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 18:32:44 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40453.
19/12/05 18:32:44 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 18:32:44 INFO data_plane.create_data_channel: Creating client data channel for localhost:45579
19/12/05 18:32:44 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 18:32:44 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 18:32:44 INFO sdk_worker.run: No more requests from control plane
19/12/05 18:32:44 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 18:32:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 18:32:44 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 18:32:44 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 18:32:44 INFO sdk_worker.run: Done consuming work.
19/12/05 18:32:44 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 18:32:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 18:32:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 18:32:44 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 18:32:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 18:32:45 INFO sdk_worker_main.main: Logging handler created.
19/12/05 18:32:45 INFO sdk_worker_main.start: Status HTTP server running at localhost:40913
19/12/05 18:32:45 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 18:32:45 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 18:32:45 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575570760.32_861b4ca6-88b5-4795-8169-271b5f8ae1ef', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 18:32:45 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575570760.32', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53783', 'job_port': u'0'}
19/12/05 18:32:45 INFO statecache.__init__: Creating state cache with size 0
19/12/05 18:32:45 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35943.
19/12/05 18:32:45 INFO sdk_worker.__init__: Control channel established.
19/12/05 18:32:45 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 18:32:45 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/05 18:32:45 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45179.
19/12/05 18:32:45 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 18:32:45 INFO data_plane.create_data_channel: Creating client data channel for localhost:38599
19/12/05 18:32:45 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 18:32:45 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 18:32:45 INFO sdk_worker.run: No more requests from control plane
19/12/05 18:32:45 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 18:32:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 18:32:45 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 18:32:45 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 18:32:45 INFO sdk_worker.run: Done consuming work.
19/12/05 18:32:45 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 18:32:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 18:32:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 18:32:45 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 18:32:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 18:32:46 INFO sdk_worker_main.main: Logging handler created.
19/12/05 18:32:46 INFO sdk_worker_main.start: Status HTTP server running at localhost:38977
19/12/05 18:32:46 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 18:32:46 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 18:32:46 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575570760.32_861b4ca6-88b5-4795-8169-271b5f8ae1ef', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 18:32:46 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575570760.32', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53783', 'job_port': u'0'}
19/12/05 18:32:46 INFO statecache.__init__: Creating state cache with size 0
19/12/05 18:32:46 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34881.
19/12/05 18:32:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/05 18:32:46 INFO sdk_worker.__init__: Control channel established.
19/12/05 18:32:46 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 18:32:46 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46603.
19/12/05 18:32:46 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 18:32:46 INFO data_plane.create_data_channel: Creating client data channel for localhost:37179
19/12/05 18:32:46 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 18:32:46 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 18:32:46 INFO sdk_worker.run: No more requests from control plane
19/12/05 18:32:46 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 18:32:46 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 18:32:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 18:32:46 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 18:32:46 INFO sdk_worker.run: Done consuming work.
19/12/05 18:32:46 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 18:32:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 18:32:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 18:32:47 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575570760.32_861b4ca6-88b5-4795-8169-271b5f8ae1ef finished.
19/12/05 18:32:47 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/05 18:32:47 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_20ad8490-0c89-40ee-a22c-589bb52f7a7d","basePath":"/tmp/sparktestkd29Mf"}: {}
java.io.FileNotFoundException: /tmp/sparktestkd29Mf/job_20ad8490-0c89-40ee-a22c-589bb52f7a7d/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140152693319424)>

# Thread: <Thread(Thread-116, started daemon 140152676534016)>

# Thread: <_MainThread(MainThread, started 140153679021824)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140152659748608)>

# Thread: <Thread(Thread-122, started daemon 140152668141312)>

# Thread: <Thread(Thread-116, started daemon 140152676534016)>

# Thread: <_MainThread(MainThread, started 140153679021824)>

# Thread: <Thread(wait_until_finish_read, started daemon 140152693319424)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575570749.81_9933592b-5e39-4ea2-ba36-ab218142165f failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 379.471s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 52s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/bqxfjrkcycdoi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1703

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1703/display/redirect?page=changes>

Changes:

[iemejia] [BEAM-8861] Disallow self-signed certificates by default in


------------------------------------------
[...truncated 1.31 MB...]
19/12/05 16:42:13 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 16:42:13 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575564128.1_eb5f6f58-c67d-4d6d-9c30-7758bbd527e5', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=29', u'--enable_spark_metric_sinks'] 
19/12/05 16:42:13 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575564128.1', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49997', 'job_port': u'0'}
19/12/05 16:42:13 INFO statecache.__init__: Creating state cache with size 0
19/12/05 16:42:13 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43467.
19/12/05 16:42:13 INFO sdk_worker.__init__: Control channel established.
19/12/05 16:42:13 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 16:42:13 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 256-1
19/12/05 16:42:13 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39263.
19/12/05 16:42:13 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 16:42:13 INFO data_plane.create_data_channel: Creating client data channel for localhost:40081
19/12/05 16:42:13 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 16:42:13 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 16:42:14 INFO sdk_worker.run: No more requests from control plane
19/12/05 16:42:14 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 16:42:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 16:42:14 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 16:42:14 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 16:42:14 INFO sdk_worker.run: Done consuming work.
19/12/05 16:42:14 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 16:42:14 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 16:42:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 16:42:14 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 16:42:15 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 16:42:15 INFO sdk_worker_main.main: Logging handler created.
19/12/05 16:42:15 INFO sdk_worker_main.start: Status HTTP server running at localhost:41233
19/12/05 16:42:15 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 16:42:15 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 16:42:15 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575564128.1_eb5f6f58-c67d-4d6d-9c30-7758bbd527e5', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=29', u'--enable_spark_metric_sinks'] 
19/12/05 16:42:15 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575564128.1', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49997', 'job_port': u'0'}
19/12/05 16:42:15 INFO statecache.__init__: Creating state cache with size 0
19/12/05 16:42:15 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37317.
19/12/05 16:42:15 INFO sdk_worker.__init__: Control channel established.
19/12/05 16:42:15 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 16:42:15 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 257-1
19/12/05 16:42:15 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44115.
19/12/05 16:42:15 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 16:42:15 INFO data_plane.create_data_channel: Creating client data channel for localhost:33059
19/12/05 16:42:15 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 16:42:15 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 16:42:15 INFO sdk_worker.run: No more requests from control plane
19/12/05 16:42:15 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 16:42:15 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 16:42:15 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 16:42:15 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 16:42:15 INFO sdk_worker.run: Done consuming work.
19/12/05 16:42:15 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 16:42:15 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 16:42:15 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 16:42:15 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 16:42:16 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 16:42:16 INFO sdk_worker_main.main: Logging handler created.
19/12/05 16:42:16 INFO sdk_worker_main.start: Status HTTP server running at localhost:38479
19/12/05 16:42:16 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 16:42:16 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 16:42:16 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575564128.1_eb5f6f58-c67d-4d6d-9c30-7758bbd527e5', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=29', u'--enable_spark_metric_sinks'] 
19/12/05 16:42:16 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575564128.1', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49997', 'job_port': u'0'}
19/12/05 16:42:16 INFO statecache.__init__: Creating state cache with size 0
19/12/05 16:42:16 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39863.
19/12/05 16:42:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 258-1
19/12/05 16:42:16 INFO sdk_worker.__init__: Control channel established.
19/12/05 16:42:16 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 16:42:16 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36305.
19/12/05 16:42:16 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 16:42:16 INFO data_plane.create_data_channel: Creating client data channel for localhost:33037
19/12/05 16:42:16 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 16:42:16 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 16:42:16 INFO sdk_worker.run: No more requests from control plane
19/12/05 16:42:16 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 16:42:16 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 16:42:16 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 16:42:16 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 16:42:16 INFO sdk_worker.run: Done consuming work.
19/12/05 16:42:16 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 16:42:16 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 16:42:16 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 16:42:16 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575564128.1_eb5f6f58-c67d-4d6d-9c30-7758bbd527e5 finished.
19/12/05 16:42:16 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/05 16:42:16 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_d848e517-2981-45a5-ade7-0da5f59aa064","basePath":"/tmp/sparktestTFVGYL"}: {}
java.io.FileNotFoundException: /tmp/sparktestTFVGYL/job_d848e517-2981-45a5-ade7-0da5f59aa064/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140607776327424)>

# Thread: <Thread(Thread-119, started daemon 140607699810048)>

# Thread: <_MainThread(MainThread, started 140608556066560)>

======================================================================
ERROR: test_pardo_unfusable_side_inputs (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 244, in test_pardo_unfusable_side_inputs
    equal_to([('a', 'a'), ('a', 'b'), ('b', 'a'), ('b', 'b')]))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140607674631936)>

# Thread: <Thread(Thread-125, started daemon 140607683024640)>

# Thread: <_MainThread(MainThread, started 140608556066560)>

# Thread: <Thread(Thread-119, started daemon 140607699810048)>

# Thread: <Thread(wait_until_finish_read, started daemon 140607776327424)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140607657846528)>

# Thread: <Thread(Thread-131, started daemon 140607666239232)>

# Thread: <_MainThread(MainThread, started 140608556066560)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575564114.56_6ecc68d8-2713-4bed-8849-a62f2bfb686a failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 397.624s

FAILED (errors=4, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 50s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/x2n2ne2pzuiqe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1702

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1702/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/12/05 12:26:17 INFO sdk_worker_main.start: Status HTTP server running at localhost:33713
19/12/05 12:26:17 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 12:26:17 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 12:26:17 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575548774.92_076372a3-5ec6-4402-8fff-965f08690344', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 12:26:17 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575548774.92', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:32945', 'job_port': u'0'}
19/12/05 12:26:17 INFO statecache.__init__: Creating state cache with size 0
19/12/05 12:26:17 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38503.
19/12/05 12:26:17 INFO sdk_worker.__init__: Control channel established.
19/12/05 12:26:17 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/05 12:26:17 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 12:26:17 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39167.
19/12/05 12:26:17 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 12:26:17 INFO data_plane.create_data_channel: Creating client data channel for localhost:41327
19/12/05 12:26:17 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 12:26:17 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 12:26:17 INFO sdk_worker.run: No more requests from control plane
19/12/05 12:26:17 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 12:26:17 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 12:26:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 12:26:17 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 12:26:17 INFO sdk_worker.run: Done consuming work.
19/12/05 12:26:17 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 12:26:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 12:26:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 12:26:18 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 12:26:18 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 12:26:18 INFO sdk_worker_main.main: Logging handler created.
19/12/05 12:26:18 INFO sdk_worker_main.start: Status HTTP server running at localhost:37165
19/12/05 12:26:18 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 12:26:18 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 12:26:18 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575548774.92_076372a3-5ec6-4402-8fff-965f08690344', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 12:26:18 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575548774.92', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:32945', 'job_port': u'0'}
19/12/05 12:26:18 INFO statecache.__init__: Creating state cache with size 0
19/12/05 12:26:18 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43691.
19/12/05 12:26:18 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/05 12:26:18 INFO sdk_worker.__init__: Control channel established.
19/12/05 12:26:18 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 12:26:18 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35757.
19/12/05 12:26:18 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 12:26:18 INFO data_plane.create_data_channel: Creating client data channel for localhost:37521
19/12/05 12:26:18 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 12:26:18 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 12:26:18 INFO sdk_worker.run: No more requests from control plane
19/12/05 12:26:18 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 12:26:18 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 12:26:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 12:26:18 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 12:26:18 INFO sdk_worker.run: Done consuming work.
19/12/05 12:26:18 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 12:26:18 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 12:26:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 12:26:18 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 12:26:19 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 12:26:19 INFO sdk_worker_main.main: Logging handler created.
19/12/05 12:26:19 INFO sdk_worker_main.start: Status HTTP server running at localhost:42201
19/12/05 12:26:19 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 12:26:19 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 12:26:19 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575548774.92_076372a3-5ec6-4402-8fff-965f08690344', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 12:26:19 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575548774.92', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:32945', 'job_port': u'0'}
19/12/05 12:26:19 INFO statecache.__init__: Creating state cache with size 0
19/12/05 12:26:19 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38693.
19/12/05 12:26:19 INFO sdk_worker.__init__: Control channel established.
19/12/05 12:26:19 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/05 12:26:19 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 12:26:19 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38489.
19/12/05 12:26:19 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 12:26:19 INFO data_plane.create_data_channel: Creating client data channel for localhost:38605
19/12/05 12:26:19 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 12:26:19 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 12:26:19 INFO sdk_worker.run: No more requests from control plane
19/12/05 12:26:19 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 12:26:19 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 12:26:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 12:26:19 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 12:26:19 INFO sdk_worker.run: Done consuming work.
19/12/05 12:26:19 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 12:26:19 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 12:26:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 12:26:19 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 12:26:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 12:26:20 INFO sdk_worker_main.main: Logging handler created.
19/12/05 12:26:20 INFO sdk_worker_main.start: Status HTTP server running at localhost:45411
19/12/05 12:26:20 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 12:26:20 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 12:26:20 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575548774.92_076372a3-5ec6-4402-8fff-965f08690344', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 12:26:20 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575548774.92', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:32945', 'job_port': u'0'}
19/12/05 12:26:20 INFO statecache.__init__: Creating state cache with size 0
19/12/05 12:26:20 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33957.
19/12/05 12:26:20 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/05 12:26:20 INFO sdk_worker.__init__: Control channel established.
19/12/05 12:26:20 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 12:26:20 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35965.
19/12/05 12:26:20 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 12:26:20 INFO data_plane.create_data_channel: Creating client data channel for localhost:38135
19/12/05 12:26:20 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 12:26:20 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 12:26:20 INFO sdk_worker.run: No more requests from control plane
19/12/05 12:26:20 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 12:26:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 12:26:20 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 12:26:20 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 12:26:20 INFO sdk_worker.run: Done consuming work.
19/12/05 12:26:20 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 12:26:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 12:26:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 12:26:20 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575548774.92_076372a3-5ec6-4402-8fff-965f08690344 finished.
19/12/05 12:26:20 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/05 12:26:20 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_431ac23b-0ea2-4755-bddb-9cd9d8522d5f","basePath":"/tmp/sparktestpH07nU"}: {}
java.io.FileNotFoundException: /tmp/sparktestpH07nU/job_431ac23b-0ea2-4755-bddb-9cd9d8522d5f/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
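
This FileNotFoundException is cleanup noise rather than a test failure: these suites stage no artifacts ("GetManifest for __no_artifacts_staged__" above), so the MANIFEST the staging-directory cleanup tries to read never existed. A tolerant cleanup, sketched with hypothetical names rather than Beam's actual code:

    import errno
    import os
    import shutil

    def remove_staging_dir(base_path, session_id):
        staging = os.path.join(base_path, session_id)
        manifest = os.path.join(staging, 'MANIFEST')
        try:
            open(manifest).close()  # the real code reads this to list staged artifacts
        except IOError as e:
            if e.errno == errno.ENOENT:
                return  # nothing was staged, so there is nothing to clean up
            raise
        shutil.rmtree(staging, ignore_errors=True)
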
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
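
For context, the "Timed out after 60 seconds." banners and "# Thread:" listings that appear below come from the test harness's timeout hook (the "handler" frame in portable_runner_test.py in these tracebacks). A minimal sketch of such a hook, assuming a SIGALRM-based watchdog (illustrative, not Beam's actual code):

    import signal
    import threading

    TIMEOUT_SECS = 60  # matches the banner in the logs

    def handler(signum, frame):
        # Runs on the main thread's stack, which is why the handler frame
        # appears at the bottom of the interrupted test's traceback.
        msg = 'Timed out after %d seconds.' % TIMEOUT_SECS
        print('=' * 20 + ' ' + msg + ' ' + '=' * 20)
        for t in threading.enumerate():  # produces the "# Thread:" dump
            print('# Thread: %s' % t)
        raise BaseException(msg)  # BaseException so `except Exception` cannot swallow it

    signal.signal(signal.SIGALRM, handler)
    signal.alarm(TIMEOUT_SECS)  # re-armed before each blocking wait
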

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139928603416320)>

# Thread: <Thread(Thread-120, started daemon 139928595023616)>

# Thread: <_MainThread(MainThread, started 139929391048448)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139928585844480)>

# Thread: <Thread(Thread-126, started daemon 139928577451776)>

# Thread: <_MainThread(MainThread, started 139929391048448)>

# Thread: <Thread(wait_until_finish_read, started daemon 139928603416320)>

# Thread: <Thread(Thread-120, started daemon 139928595023616)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575548765.75_b3efb9cb-9ad7-4736-b797-0181a01a1729 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
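
Every build in this thread fails test_sdf_with_watermark_tracking the same way: the Spark portable runner has no checkpoint handler to accept a splittable DoFn's deferred (residual) work. An illustrative-only Python model of that contract, not Beam's actual Java API:

    class ActiveBundle(object):
        """Toy model of a runner-side bundle that may receive SDF checkpoints."""

        def __init__(self, checkpoint_handler=None):
            self._checkpoint_handler = checkpoint_handler

        def on_checkpoint(self, residual):
            # Mirrors the UnsupportedOperationException seen above: without a
            # registered handler there is nowhere to hand the residual work.
            if self._checkpoint_handler is None:
                raise NotImplementedError(
                    'The ActiveBundle does not have a registered '
                    'bundle checkpoint handler.')
            self._checkpoint_handler(residual)
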

----------------------------------------------------------------------
Ran 38 tests in 319.042s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 56s
60 actionable tasks: 48 executed, 12 from cache

Publishing build scan...
https://scans.gradle.com/s/36dpr4ou7alai

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1701

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1701/display/redirect?page=changes>

Changes:

[lgajowy] [BEAM-6627] Add size reporting to JdbcIOIT (#10267)


------------------------------------------
[...truncated 1.39 MB...]
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

======================================================================
ERROR: test_pardo_unfusable_side_inputs (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 244, in test_pardo_unfusable_side_inputs
    equal_to([('a', 'a'), ('a', 'b'), ('b', 'a'), ('b', 'b')]))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_windowed_side_inputs (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 181, in test_pardo_windowed_side_inputs
    label='windowed')
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_read (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 578, in test_read
    equal_to(['a', 'b', 'c']))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
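
The assertions in these tracebacks all use Beam's standard testing idiom. A self-contained example of the same pattern (this one runs on the default DirectRunner rather than the Spark runner under test):

    import apache_beam as beam
    from apache_beam.testing.util import assert_that, equal_to

    with beam.Pipeline() as p:
        actual = p | beam.Create(['a', 'b', 'c'])
        # The same check test_read makes above; wait_until_finish() is
        # called implicitly when the `with` block exits.
        assert_that(actual, equal_to(['a', 'b', 'c']))
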

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575543622.22_265bae3e-05be-4ee5-b555-1f2776f8ffd2 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 703.331s

FAILED (errors=6, skipped=9)

# Thread: <Thread(wait_until_finish_read, started daemon 140102398502656)>

# Thread: <Thread(Thread-93, started daemon 140102406895360)>

# Thread: <_MainThread(MainThread, started 140103195027200)>
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140102381717248)>

# Thread: <Thread(Thread-97, started daemon 140102390109952)>

# Thread: <_MainThread(MainThread, started 140103195027200)>

# Thread: <Thread(Thread-93, started daemon 140102406895360)>

# Thread: <Thread(wait_until_finish_read, started daemon 140102398502656)>
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140101757691648)>

# Thread: <Thread(Thread-102, started daemon 140101766084352)>

# Thread: <Thread(wait_until_finish_read, started daemon 140102381717248)>

# Thread: <Thread(wait_until_finish_read, started daemon 140102398502656)>

# Thread: <Thread(Thread-93, started daemon 140102406895360)>

# Thread: <_MainThread(MainThread, started 140103195027200)>

# Thread: <Thread(Thread-97, started daemon 140102390109952)>
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140101740906240)>

# Thread: <Thread(Thread-107, started daemon 140101749298944)>

# Thread: <Thread(wait_until_finish_read, started daemon 140102381717248)>

# Thread: <Thread(Thread-97, started daemon 140102390109952)>

# Thread: <Thread(Thread-93, started daemon 140102406895360)>

# Thread: <Thread(wait_until_finish_read, started daemon 140101757691648)>

# Thread: <Thread(wait_until_finish_read, started daemon 140102398502656)>

# Thread: <_MainThread(MainThread, started 140103195027200)>

# Thread: <Thread(Thread-102, started daemon 140101766084352)>
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140101724120832)>

# Thread: <Thread(Thread-111, started daemon 140101732513536)>

# Thread: <Thread(wait_until_finish_read, started daemon 140101740906240)>

# Thread: <Thread(Thread-107, started daemon 140101749298944)>

# Thread: <Thread(wait_until_finish_read, started daemon 140101757691648)>

# Thread: <Thread(Thread-97, started daemon 140102390109952)>

# Thread: <Thread(wait_until_finish_read, started daemon 140102381717248)>

# Thread: <_MainThread(MainThread, started 140103195027200)>

# Thread: <Thread(Thread-102, started daemon 140101766084352)>

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 16m 31s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/ibemntcczq5uy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1700

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1700/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/12/05 06:15:05 INFO sdk_worker_main.start: Status HTTP server running at localhost:45983
19/12/05 06:15:05 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 06:15:05 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 06:15:05 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575526503.05_79ee8e7c-9018-49c7-8de9-5467b0b12eef', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 06:15:05 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575526503.05', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:60481', 'job_port': u'0'}
19/12/05 06:15:05 INFO statecache.__init__: Creating state cache with size 0
19/12/05 06:15:05 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43029.
19/12/05 06:15:05 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/05 06:15:05 INFO sdk_worker.__init__: Control channel established.
19/12/05 06:15:05 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 06:15:05 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39189.
19/12/05 06:15:05 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 06:15:05 INFO data_plane.create_data_channel: Creating client data channel for localhost:37847
19/12/05 06:15:05 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 06:15:05 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 06:15:05 INFO sdk_worker.run: No more requests from control plane
19/12/05 06:15:05 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 06:15:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 06:15:05 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 06:15:05 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 06:15:05 INFO sdk_worker.run: Done consuming work.
19/12/05 06:15:05 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 06:15:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 06:15:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 06:15:06 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
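
Each worker cycle above follows the same handshake: the runner launches an SDK harness, which dials back over separate control, state, and data gRPC channels. A stripped-down model of that dial-back (ports taken from the log above; the real logic lives in sdk_worker.py):

    import grpc

    control_channel = grpc.insecure_channel('localhost:43029')  # control plane
    state_channel = grpc.insecure_channel('localhost:39189')    # state API
    data_channel = grpc.insecure_channel('localhost:37847')     # data plane

    # Block until the runner accepts the connection ("Control channel established.")
    grpc.channel_ready_future(control_channel).result(timeout=60)
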
19/12/05 06:15:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 06:15:06 INFO sdk_worker_main.main: Logging handler created.
19/12/05 06:15:06 INFO sdk_worker_main.start: Status HTTP server running at localhost:44249
19/12/05 06:15:06 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 06:15:06 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 06:15:06 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575526503.05_79ee8e7c-9018-49c7-8de9-5467b0b12eef', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 06:15:06 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575526503.05', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:60481', 'job_port': u'0'}
19/12/05 06:15:06 INFO statecache.__init__: Creating state cache with size 0
19/12/05 06:15:06 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44861.
19/12/05 06:15:06 INFO sdk_worker.__init__: Control channel established.
19/12/05 06:15:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/05 06:15:06 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 06:15:06 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46403.
19/12/05 06:15:06 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 06:15:06 INFO data_plane.create_data_channel: Creating client data channel for localhost:42515
19/12/05 06:15:06 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 06:15:06 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 06:15:06 INFO sdk_worker.run: No more requests from control plane
19/12/05 06:15:06 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 06:15:06 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 06:15:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 06:15:06 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 06:15:06 INFO sdk_worker.run: Done consuming work.
19/12/05 06:15:06 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 06:15:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 06:15:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 06:15:07 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 06:15:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 06:15:07 INFO sdk_worker_main.main: Logging handler created.
19/12/05 06:15:07 INFO sdk_worker_main.start: Status HTTP server running at localhost:44225
19/12/05 06:15:07 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 06:15:07 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 06:15:07 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575526503.05_79ee8e7c-9018-49c7-8de9-5467b0b12eef', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 06:15:07 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575526503.05', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:60481', 'job_port': u'0'}
19/12/05 06:15:07 INFO statecache.__init__: Creating state cache with size 0
19/12/05 06:15:07 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38295.
19/12/05 06:15:07 INFO sdk_worker.__init__: Control channel established.
19/12/05 06:15:07 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 06:15:07 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/05 06:15:07 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44447.
19/12/05 06:15:07 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 06:15:07 INFO data_plane.create_data_channel: Creating client data channel for localhost:32939
19/12/05 06:15:07 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 06:15:07 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 06:15:07 INFO sdk_worker.run: No more requests from control plane
19/12/05 06:15:07 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 06:15:07 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 06:15:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 06:15:07 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 06:15:07 INFO sdk_worker.run: Done consuming work.
19/12/05 06:15:07 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 06:15:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 06:15:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 06:15:07 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 06:15:08 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 06:15:08 INFO sdk_worker_main.main: Logging handler created.
19/12/05 06:15:08 INFO sdk_worker_main.start: Status HTTP server running at localhost:39475
19/12/05 06:15:08 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 06:15:08 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 06:15:08 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575526503.05_79ee8e7c-9018-49c7-8de9-5467b0b12eef', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 06:15:08 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575526503.05', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:60481', 'job_port': u'0'}
19/12/05 06:15:08 INFO statecache.__init__: Creating state cache with size 0
19/12/05 06:15:08 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39067.
19/12/05 06:15:08 INFO sdk_worker.__init__: Control channel established.
19/12/05 06:15:08 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/05 06:15:08 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 06:15:08 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43991.
19/12/05 06:15:08 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 06:15:08 INFO data_plane.create_data_channel: Creating client data channel for localhost:39923
19/12/05 06:15:08 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 06:15:08 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 06:15:08 INFO sdk_worker.run: No more requests from control plane
19/12/05 06:15:08 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 06:15:08 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 06:15:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 06:15:08 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 06:15:08 INFO sdk_worker.run: Done consuming work.
19/12/05 06:15:08 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 06:15:08 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 06:15:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 06:15:08 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575526503.05_79ee8e7c-9018-49c7-8de9-5467b0b12eef finished.
19/12/05 06:15:08 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/05 06:15:08 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_8611292d-c435-420c-a625-f81a77e88428","basePath":"/tmp/sparktestWBHZhZ"}: {}
java.io.FileNotFoundException: /tmp/sparktestWBHZhZ/job_8611292d-c435-420c-a625-f81a77e88428/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140445623510784)>

# Thread: <Thread(Thread-117, started daemon 140445631903488)>

# Thread: <_MainThread(MainThread, started 140446758721280)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140445606725376)>

# Thread: <Thread(Thread-123, started daemon 140445615118080)>

# Thread: <_MainThread(MainThread, started 140446758721280)>

# Thread: <Thread(Thread-117, started daemon 140445631903488)>

# Thread: <Thread(wait_until_finish_read, started daemon 140445623510784)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575526492.68_dad866b2-ace0-4dd5-82bb-bdab8f8cc326 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 337.066s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 19s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/m64j4wz465avm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1699

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1699/display/redirect?page=changes>

Changes:

[github] Merge pull request #10247: [BEAM-7274] In preparation for


------------------------------------------
[...truncated 1.32 MB...]
19/12/05 05:27:46 INFO sdk_worker_main.start: Status HTTP server running at localhost:40601
19/12/05 05:27:46 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 05:27:46 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 05:27:46 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575523663.25_3e008bd1-e458-43a9-990c-a867ea567448', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 05:27:46 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575523663.25', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53773', 'job_port': u'0'}
19/12/05 05:27:46 INFO statecache.__init__: Creating state cache with size 0
19/12/05 05:27:46 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38243.
19/12/05 05:27:46 INFO sdk_worker.__init__: Control channel established.
19/12/05 05:27:46 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 05:27:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/05 05:27:46 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35255.
19/12/05 05:27:46 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 05:27:46 INFO data_plane.create_data_channel: Creating client data channel for localhost:38785
19/12/05 05:27:46 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 05:27:46 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 05:27:46 INFO sdk_worker.run: No more requests from control plane
19/12/05 05:27:46 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 05:27:46 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 05:27:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 05:27:46 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 05:27:46 INFO sdk_worker.run: Done consuming work.
19/12/05 05:27:46 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 05:27:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 05:27:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 05:27:46 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
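A note on the recurring "Discarding unparseable args" warnings above: the SDK harness parses only the flags it understands and warns about, rather than rejecting, the rest, so runner-side flags such as --job_server_timeout always show up as "unparseable" here. A rough sketch of that tolerant-parsing pattern (hypothetical flag set, not the real PipelineOptions implementation):

    import argparse
    import logging

    def get_all_options(argv):
        # Keep the flags this process knows; warn about, but tolerate, the rest.
        parser = argparse.ArgumentParser()
        parser.add_argument('--job_endpoint')
        parser.add_argument('--environment_type')
        parser.add_argument('--sdk_worker_parallelism', type=int, default=1)
        known, unknown = parser.parse_known_args(argv)
        if unknown:
            logging.warning('Discarding unparseable args: %s', unknown)
        return vars(known)

    # get_all_options(['--environment_type=PROCESS', '--job_server_timeout=60'])
    # warns about --job_server_timeout and returns only the recognized flags.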
19/12/05 05:27:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 05:27:47 INFO sdk_worker_main.main: Logging handler created.
19/12/05 05:27:47 INFO sdk_worker_main.start: Status HTTP server running at localhost:45121
19/12/05 05:27:47 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 05:27:47 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 05:27:47 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575523663.25_3e008bd1-e458-43a9-990c-a867ea567448', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 05:27:47 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575523663.25', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53773', 'job_port': u'0'}
19/12/05 05:27:47 INFO statecache.__init__: Creating state cache with size 0
19/12/05 05:27:47 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43489.
19/12/05 05:27:47 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/05 05:27:47 INFO sdk_worker.__init__: Control channel established.
19/12/05 05:27:47 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 05:27:47 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43993.
19/12/05 05:27:47 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 05:27:47 INFO data_plane.create_data_channel: Creating client data channel for localhost:35363
19/12/05 05:27:47 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 05:27:47 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 05:27:47 INFO sdk_worker.run: No more requests from control plane
19/12/05 05:27:47 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 05:27:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 05:27:47 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 05:27:47 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 05:27:47 INFO sdk_worker.run: Done consuming work.
19/12/05 05:27:47 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 05:27:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 05:27:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 05:27:47 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 05:27:48 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 05:27:48 INFO sdk_worker_main.main: Logging handler created.
19/12/05 05:27:48 INFO sdk_worker_main.start: Status HTTP server running at localhost:42793
19/12/05 05:27:48 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 05:27:48 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 05:27:48 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575523663.25_3e008bd1-e458-43a9-990c-a867ea567448', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 05:27:48 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575523663.25', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53773', 'job_port': u'0'}
19/12/05 05:27:48 INFO statecache.__init__: Creating state cache with size 0
19/12/05 05:27:48 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44621.
19/12/05 05:27:48 INFO sdk_worker.__init__: Control channel established.
19/12/05 05:27:48 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 05:27:48 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/05 05:27:48 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41443.
19/12/05 05:27:48 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 05:27:48 INFO data_plane.create_data_channel: Creating client data channel for localhost:40193
19/12/05 05:27:48 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 05:27:48 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 05:27:48 INFO sdk_worker.run: No more requests from control plane
19/12/05 05:27:48 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 05:27:48 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 05:27:48 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 05:27:48 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 05:27:48 INFO sdk_worker.run: Done consuming work.
19/12/05 05:27:48 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 05:27:48 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 05:27:48 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 05:27:48 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 05:27:49 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 05:27:49 INFO sdk_worker_main.main: Logging handler created.
19/12/05 05:27:49 INFO sdk_worker_main.start: Status HTTP server running at localhost:37829
19/12/05 05:27:49 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 05:27:49 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 05:27:49 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575523663.25_3e008bd1-e458-43a9-990c-a867ea567448', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 05:27:49 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575523663.25', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53773', 'job_port': u'0'}
19/12/05 05:27:49 INFO statecache.__init__: Creating state cache with size 0
19/12/05 05:27:49 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36985.
19/12/05 05:27:49 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/05 05:27:49 INFO sdk_worker.__init__: Control channel established.
19/12/05 05:27:49 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 05:27:49 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40823.
19/12/05 05:27:49 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 05:27:49 INFO data_plane.create_data_channel: Creating client data channel for localhost:34525
19/12/05 05:27:49 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 05:27:49 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 05:27:49 INFO sdk_worker.run: No more requests from control plane
19/12/05 05:27:49 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 05:27:49 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 05:27:49 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 05:27:49 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 05:27:49 INFO sdk_worker.run: Done consuming work.
19/12/05 05:27:49 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 05:27:49 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 05:27:49 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 05:27:49 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575523663.25_3e008bd1-e458-43a9-990c-a867ea567448 finished.
19/12/05 05:27:49 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/05 05:27:49 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_489573cc-5c3a-4bd5-b29e-4382134f7bc6","basePath":"/tmp/sparktestZzLP69"}: {}
java.io.FileNotFoundException: /tmp/sparktestZzLP69/job_489573cc-5c3a-4bd5-b29e-4382134f7bc6/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
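The FileNotFoundException above is cleanup noise rather than a real failure: this suite stages no artifacts (note the repeated "GetManifest for __no_artifacts_staged__" lines), so the MANIFEST that removeArtifacts tries to open never existed. A tolerant version of that cleanup, sketched in Python for brevity (hypothetical helper; the real code is the Java BeamFileSystemArtifactStagingService):

    import errno
    import logging
    import os
    import shutil

    def remove_staging_dir(staging_dir):
        # Treat a missing MANIFEST (nothing was ever staged) as already clean
        # instead of logging a full stack trace at WARN.
        manifest = os.path.join(staging_dir, 'MANIFEST')
        if not os.path.exists(manifest):
            logging.debug('No manifest at %s; nothing to clean up.', manifest)
            return
        try:
            shutil.rmtree(staging_dir)
        except OSError as err:
            if err.errno != errno.ENOENT:  # re-raise anything but "already gone"
                raise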
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
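Every one of these timeouts is stuck in the same frame: the blocking iteration "for state_response in self._state_stream:" inside wait_until_finish. If the job service never emits a terminal state, that gRPC server-streaming read blocks indefinitely, which is what the external watchdog guards against. A sketch of the alternative, a client-side deadline on the streaming call (the terminal-state values and call shape are assumptions, not Beam's verified API):

    import grpc

    TERMINAL_STATES = frozenset(['DONE', 'FAILED', 'CANCELLED'])  # assumed names

    def wait_for_terminal_state(state_stream):
        # `state_stream` is the iterator returned by a server-streaming RPC that
        # was invoked with a deadline, e.g. stub.GetStateStream(request, timeout=60).
        try:
            for state_response in state_stream:
                if state_response.state in TERMINAL_STATES:
                    return state_response.state
        except grpc.RpcError as err:
            # With a deadline set, a hang surfaces as DEADLINE_EXCEEDED instead
            # of blocking the test runner forever.
            if err.code() == grpc.StatusCode.DEADLINE_EXCEEDED:
                raise RuntimeError('No terminal state before the deadline.')
            raise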

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139802192115456)>

# Thread: <Thread(Thread-119, started daemon 139802273539840)>

# Thread: <_MainThread(MainThread, started 139803060610816)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575523652.41_bc04077e-d636-4176-b465-dc5537d9bd76 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139802175330048)>

# Thread: <Thread(Thread-125, started daemon 139802183722752)>

# Thread: <Thread(Thread-119, started daemon 139802273539840)>

# Thread: <Thread(wait_until_finish_read, started daemon 139802192115456)>

# Thread: <_MainThread(MainThread, started 139803060610816)>

----------------------------------------------------------------------
Ran 38 tests in 345.521s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 10m 13s
60 actionable tasks: 51 executed, 9 from cache

Publishing build scan...
https://scans.gradle.com/s/bwyyljxtbj5tu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1698

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1698/display/redirect?page=changes>

Changes:

[chamikara] [BEAM-8884] Fix mongodb splitVector command result type issue (#10282)


------------------------------------------
[...truncated 1.32 MB...]
19/12/05 02:08:38 INFO sdk_worker_main.start: Status HTTP server running at localhost:41881
19/12/05 02:08:38 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 02:08:38 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 02:08:38 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575511716.49_a0b84da8-ec4d-49d8-bef1-47b0d83c9ea3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 02:08:38 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575511716.49', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41481', 'job_port': u'0'}
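For orientation, the pipeline_options dump above shows how this suite wires up the portable runner: --environment_type=PROCESS with a JSON --environment_config tells the runner to launch the SDK harness as a local subprocess via sdk_worker.sh. A minimal sketch of running a pipeline with that configuration (endpoint and path are placeholders):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:8099',  # placeholder job server endpoint
        '--environment_type=PROCESS',
        # Placeholder path; the runner starts this script for each SDK worker.
        '--environment_config={"command": "/path/to/sdk_worker.sh"}',
    ])

    with beam.Pipeline(options=options) as pipeline:
        _ = (pipeline
             | beam.Create(['smoke test'])
             | beam.Map(print))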
19/12/05 02:08:38 INFO statecache.__init__: Creating state cache with size 0
19/12/05 02:08:38 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38465.
19/12/05 02:08:38 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/05 02:08:38 INFO sdk_worker.__init__: Control channel established.
19/12/05 02:08:38 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 02:08:38 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43791.
19/12/05 02:08:38 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 02:08:38 INFO data_plane.create_data_channel: Creating client data channel for localhost:33035
19/12/05 02:08:38 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 02:08:39 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 02:08:39 INFO sdk_worker.run: No more requests from control plane
19/12/05 02:08:39 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 02:08:39 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 02:08:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 02:08:39 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 02:08:39 INFO sdk_worker.run: Done consuming work.
19/12/05 02:08:39 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 02:08:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 02:08:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 02:08:39 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 02:08:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 02:08:39 INFO sdk_worker_main.main: Logging handler created.
19/12/05 02:08:39 INFO sdk_worker_main.start: Status HTTP server running at localhost:46663
19/12/05 02:08:39 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 02:08:39 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 02:08:39 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575511716.49_a0b84da8-ec4d-49d8-bef1-47b0d83c9ea3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 02:08:39 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575511716.49', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41481', 'job_port': u'0'}
19/12/05 02:08:39 INFO statecache.__init__: Creating state cache with size 0
19/12/05 02:08:39 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42851.
19/12/05 02:08:39 INFO sdk_worker.__init__: Control channel established.
19/12/05 02:08:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/05 02:08:39 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 02:08:39 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40801.
19/12/05 02:08:39 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 02:08:39 INFO data_plane.create_data_channel: Creating client data channel for localhost:33585
19/12/05 02:08:39 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 02:08:39 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 02:08:39 INFO sdk_worker.run: No more requests from control plane
19/12/05 02:08:39 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 02:08:39 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 02:08:39 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 02:08:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 02:08:39 INFO sdk_worker.run: Done consuming work.
19/12/05 02:08:39 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 02:08:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 02:08:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 02:08:40 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 02:08:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 02:08:40 INFO sdk_worker_main.main: Logging handler created.
19/12/05 02:08:40 INFO sdk_worker_main.start: Status HTTP server running at localhost:35083
19/12/05 02:08:40 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 02:08:40 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 02:08:40 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575511716.49_a0b84da8-ec4d-49d8-bef1-47b0d83c9ea3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 02:08:40 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575511716.49', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41481', 'job_port': u'0'}
19/12/05 02:08:40 INFO statecache.__init__: Creating state cache with size 0
19/12/05 02:08:40 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35103.
19/12/05 02:08:40 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/05 02:08:40 INFO sdk_worker.__init__: Control channel established.
19/12/05 02:08:40 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 02:08:40 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34895.
19/12/05 02:08:40 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 02:08:40 INFO data_plane.create_data_channel: Creating client data channel for localhost:46319
19/12/05 02:08:40 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 02:08:40 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 02:08:40 INFO sdk_worker.run: No more requests from control plane
19/12/05 02:08:40 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 02:08:40 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 02:08:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 02:08:40 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 02:08:40 INFO sdk_worker.run: Done consuming work.
19/12/05 02:08:40 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 02:08:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 02:08:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 02:08:40 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 02:08:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 02:08:41 INFO sdk_worker_main.main: Logging handler created.
19/12/05 02:08:41 INFO sdk_worker_main.start: Status HTTP server running at localhost:33173
19/12/05 02:08:41 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 02:08:41 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 02:08:41 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575511716.49_a0b84da8-ec4d-49d8-bef1-47b0d83c9ea3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 02:08:41 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575511716.49', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41481', 'job_port': u'0'}
19/12/05 02:08:41 INFO statecache.__init__: Creating state cache with size 0
19/12/05 02:08:41 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33569.
19/12/05 02:08:41 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/05 02:08:41 INFO sdk_worker.__init__: Control channel established.
19/12/05 02:08:41 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 02:08:41 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45517.
19/12/05 02:08:41 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 02:08:41 INFO data_plane.create_data_channel: Creating client data channel for localhost:34543
19/12/05 02:08:41 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 02:08:41 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 02:08:41 INFO sdk_worker.run: No more requests from control plane
19/12/05 02:08:41 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 02:08:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 02:08:41 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 02:08:41 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 02:08:41 INFO sdk_worker.run: Done consuming work.
19/12/05 02:08:41 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 02:08:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 02:08:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 02:08:41 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575511716.49_a0b84da8-ec4d-49d8-bef1-47b0d83c9ea3 finished.
19/12/05 02:08:41 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/05 02:08:41 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_cb1e17ec-87ad-4716-91ab-cef3cf84f2fa","basePath":"/tmp/sparktestfoljaI"}: {}
java.io.FileNotFoundException: /tmp/sparktestfoljaI/job_cb1e17ec-87ad-4716-91ab-cef3cf84f2fa/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140578402621184)>

# Thread: <Thread(Thread-120, started daemon 140578411013888)>

# Thread: <_MainThread(MainThread, started 140579199145728)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140578393704192)>

# Thread: <Thread(Thread-126, started daemon 140578385311488)>

# Thread: <Thread(Thread-120, started daemon 140578411013888)>

# Thread: <Thread(wait_until_finish_read, started daemon 140578402621184)>

# Thread: <_MainThread(MainThread, started 140579199145728)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575511707.59_240e3073-c14d-4461-a4b9-15d04bc20a71 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 293.358s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 18s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/ovlfydy6mzbca

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1697

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1697/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/12/05 00:38:45 INFO sdk_worker_main.start: Status HTTP server running at localhost:35863
19/12/05 00:38:45 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 00:38:45 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 00:38:45 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575506322.25_d3811400-701b-4125-a7e2-00b263c5c8ca', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 00:38:45 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575506322.25', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36133', 'job_port': u'0'}
19/12/05 00:38:45 INFO statecache.__init__: Creating state cache with size 0
19/12/05 00:38:45 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38485.
19/12/05 00:38:45 INFO sdk_worker.__init__: Control channel established.
19/12/05 00:38:45 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 00:38:45 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/05 00:38:45 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44171.
19/12/05 00:38:45 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 00:38:45 INFO data_plane.create_data_channel: Creating client data channel for localhost:33657
19/12/05 00:38:45 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 00:38:45 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 00:38:45 INFO sdk_worker.run: No more requests from control plane
19/12/05 00:38:45 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 00:38:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 00:38:45 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 00:38:45 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 00:38:45 INFO sdk_worker.run: Done consuming work.
19/12/05 00:38:45 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 00:38:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 00:38:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 00:38:45 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 00:38:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 00:38:46 INFO sdk_worker_main.main: Logging handler created.
19/12/05 00:38:46 INFO sdk_worker_main.start: Status HTTP server running at localhost:40211
19/12/05 00:38:46 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 00:38:46 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 00:38:46 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575506322.25_d3811400-701b-4125-a7e2-00b263c5c8ca', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 00:38:46 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575506322.25', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36133', 'job_port': u'0'}
19/12/05 00:38:46 INFO statecache.__init__: Creating state cache with size 0
19/12/05 00:38:46 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36701.
19/12/05 00:38:46 INFO sdk_worker.__init__: Control channel established.
19/12/05 00:38:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/05 00:38:46 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 00:38:46 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46001.
19/12/05 00:38:46 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 00:38:46 INFO data_plane.create_data_channel: Creating client data channel for localhost:38391
19/12/05 00:38:46 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 00:38:46 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 00:38:46 INFO sdk_worker.run: No more requests from control plane
19/12/05 00:38:46 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 00:38:46 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 00:38:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 00:38:46 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 00:38:46 INFO sdk_worker.run: Done consuming work.
19/12/05 00:38:46 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 00:38:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 00:38:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 00:38:46 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 00:38:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 00:38:47 INFO sdk_worker_main.main: Logging handler created.
19/12/05 00:38:47 INFO sdk_worker_main.start: Status HTTP server running at localhost:35551
19/12/05 00:38:47 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 00:38:47 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 00:38:47 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575506322.25_d3811400-701b-4125-a7e2-00b263c5c8ca', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 00:38:47 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575506322.25', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36133', 'job_port': u'0'}
19/12/05 00:38:47 INFO statecache.__init__: Creating state cache with size 0
19/12/05 00:38:47 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44557.
19/12/05 00:38:47 INFO sdk_worker.__init__: Control channel established.
19/12/05 00:38:47 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/05 00:38:47 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 00:38:47 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45465.
19/12/05 00:38:47 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 00:38:47 INFO data_plane.create_data_channel: Creating client data channel for localhost:33819
19/12/05 00:38:47 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 00:38:47 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 00:38:47 INFO sdk_worker.run: No more requests from control plane
19/12/05 00:38:47 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 00:38:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 00:38:47 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 00:38:47 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 00:38:47 INFO sdk_worker.run: Done consuming work.
19/12/05 00:38:47 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 00:38:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 00:38:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 00:38:47 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 00:38:48 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 00:38:48 INFO sdk_worker_main.main: Logging handler created.
19/12/05 00:38:48 INFO sdk_worker_main.start: Status HTTP server running at localhost:36921
19/12/05 00:38:48 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 00:38:48 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 00:38:48 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575506322.25_d3811400-701b-4125-a7e2-00b263c5c8ca', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 00:38:48 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575506322.25', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36133', 'job_port': u'0'}
19/12/05 00:38:48 INFO statecache.__init__: Creating state cache with size 0
19/12/05 00:38:48 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43365.
19/12/05 00:38:48 INFO sdk_worker.__init__: Control channel established.
19/12/05 00:38:48 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 00:38:48 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/05 00:38:48 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33921.
19/12/05 00:38:48 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 00:38:48 INFO data_plane.create_data_channel: Creating client data channel for localhost:45487
19/12/05 00:38:48 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 00:38:48 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 00:38:48 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 00:38:48 INFO sdk_worker.run: No more requests from control plane
19/12/05 00:38:48 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 00:38:48 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 00:38:48 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 00:38:48 INFO sdk_worker.run: Done consuming work.
19/12/05 00:38:48 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 00:38:48 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 00:38:48 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 00:38:48 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575506322.25_d3811400-701b-4125-a7e2-00b263c5c8ca finished.
19/12/05 00:38:48 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/05 00:38:48 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_a0f9c8ed-da6e-4a7e-a56d-d25fdc1f0531","basePath":"/tmp/sparktestqHzjRx"}: {}
java.io.FileNotFoundException: /tmp/sparktestqHzjRx/job_a0f9c8ed-da6e-4a7e-a56d-d25fdc1f0531/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
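This cleanup warning is cosmetic: the job staged no artifacts (note the "GetManifest for __no_artifacts_staged__" lines above), so the MANIFEST file the removal step tries to read was never written. A tolerant cleanup would check for the manifest before opening it; a sketch of the idea in Python (the real implementation is Java, in BeamFileSystemArtifactStagingService.removeArtifacts):

    # Sketch of a tolerant staging-directory cleanup, assuming the layout
    # <base_path>/<session_id>/MANIFEST seen in the warning above.
    import os
    import shutil

    def remove_staging_dir(base_path, session_id):
        job_dir = os.path.join(base_path, session_id)
        manifest = os.path.join(job_dir, 'MANIFEST')
        if not os.path.exists(manifest):
            return  # nothing was staged; skip instead of raising
        shutil.rmtree(job_dir)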
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
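The truncated line under the test name is the test's docstring: the test drives user state through a key type with a custom coder, so every state request must round-trip the key through that coder byte-for-byte. A minimal sketch of the stateful-DoFn pattern it exercises (illustrative only; the real test, with its custom key coder, lives in portable_runner_test.py):

    # State cells are addressed per (key, window), so the key coder must
    # encode deterministically for state requests to find the right cell.
    import apache_beam as beam
    from apache_beam.coders import VarIntCoder
    from apache_beam.transforms.userstate import BagStateSpec

    class BufferDoFn(beam.DoFn):
        BUFFER = BagStateSpec('buffer', VarIntCoder())

        def process(self, element, buffer=beam.DoFn.StateParam(BUFFER)):
            key, value = element
            buffer.add(value)
            yield key, list(buffer.read())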

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
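test_pardo_timers exercises event-time timers, which the portable runner services over the same state and data channels the traceback above is blocked on. The pattern under test, sketched (illustrative only):

    # Event-time timer sketch: set a timer in process(), handle it in a
    # callback once the watermark passes the target timestamp.
    import apache_beam as beam
    from apache_beam.transforms.timeutil import TimeDomain
    from apache_beam.transforms.userstate import TimerSpec, on_timer

    class TimerDoFn(beam.DoFn):
        EXPIRY = TimerSpec('expiry', TimeDomain.WATERMARK)

        def process(self, element, timer=beam.DoFn.TimerParam(EXPIRY)):
            key, ts = element
            timer.set(ts + 10)  # fire 10 seconds past the element's timestamp

        @on_timer(EXPIRY)
        def on_expiry(self):
            yield 'fired'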

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575506311.01_1778b131-c2d7-4e74-ab57-97ce7aefe341 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
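Unlike the two timeouts, this failure appears deterministic: a splittable DoFn that tracks watermarks can hand a residual restriction back to the runner mid-bundle (a checkpoint), and the portable Spark runner's ActiveBundle has no checkpoint handler registered, hence the UnsupportedOperationException. The splittable-DoFn shape involved, sketched without the watermark-estimation part (illustrative only, not the test's DoFn):

    # The restriction tracker is what lets the runner split or checkpoint
    # the work in flight -- the hand-off Spark rejects here.
    import apache_beam as beam
    from apache_beam.io.restriction_trackers import (
        OffsetRange, OffsetRestrictionTracker)
    from apache_beam.transforms.core import RestrictionProvider

    class CountProvider(RestrictionProvider):
        def initial_restriction(self, element):
            return OffsetRange(0, element)

        def create_tracker(self, restriction):
            return OffsetRestrictionTracker(restriction)

        def restriction_size(self, element, restriction):
            return restriction.size()

    class EmitRangeDoFn(beam.DoFn):
        def process(self, element,
                    tracker=beam.DoFn.RestrictionParam(CountProvider())):
            pos = tracker.current_restriction().start
            while tracker.try_claim(pos):
                yield pos
                pos += 1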

----------------------------------------------------------------------
Ran 38 tests in 374.866s

FAILED (errors=3, skipped=9)
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139975247197952)>

# Thread: <Thread(Thread-119, started daemon 139975255590656)>

# Thread: <_MainThread(MainThread, started 139976035329792)>
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139975221757696)>

# Thread: <Thread(Thread-125, started daemon 139975230150400)>

# Thread: <_MainThread(MainThread, started 139976035329792)>

# Thread: <Thread(wait_until_finish_read, started daemon 139975247197952)>

# Thread: <Thread(Thread-119, started daemon 139975255590656)>

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 37s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/hg46xzojugorw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1696

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1696/display/redirect?page=changes>

Changes:

[sniemitz] [BEAM-8809] Make the constructor for AvroWriteRequest public

[wenjialiu] [BEAM-8575] test_flatten_no_pcollection raises an exception and should


------------------------------------------
[...truncated 1.32 MB...]
19/12/04 23:57:59 INFO sdk_worker_main.start: Status HTTP server running at localhost:36423
19/12/04 23:57:59 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 23:57:59 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 23:57:59 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575503877.23_fd72d4ab-83ad-4431-9734-4220f7a89707', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 23:57:59 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575503877.23', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35557', 'job_port': u'0'}
19/12/04 23:57:59 INFO statecache.__init__: Creating state cache with size 0
19/12/04 23:57:59 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39919.
19/12/04 23:57:59 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/04 23:57:59 INFO sdk_worker.__init__: Control channel established.
19/12/04 23:57:59 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 23:57:59 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39493.
19/12/04 23:57:59 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 23:57:59 INFO data_plane.create_data_channel: Creating client data channel for localhost:46543
19/12/04 23:57:59 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 23:58:00 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 23:58:00 INFO sdk_worker.run: No more requests from control plane
19/12/04 23:58:00 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 23:58:00 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 23:58:00 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 23:58:00 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 23:58:00 INFO sdk_worker.run: Done consuming work.
19/12/04 23:58:00 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 23:58:00 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 23:58:00 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 23:58:00 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 23:58:00 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 23:58:00 INFO sdk_worker_main.main: Logging handler created.
19/12/04 23:58:00 INFO sdk_worker_main.start: Status HTTP server running at localhost:45569
19/12/04 23:58:00 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 23:58:00 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 23:58:00 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575503877.23_fd72d4ab-83ad-4431-9734-4220f7a89707', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 23:58:00 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575503877.23', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35557', 'job_port': u'0'}
19/12/04 23:58:00 INFO statecache.__init__: Creating state cache with size 0
19/12/04 23:58:00 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34677.
19/12/04 23:58:00 INFO sdk_worker.__init__: Control channel established.
19/12/04 23:58:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/04 23:58:00 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 23:58:00 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33483.
19/12/04 23:58:00 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 23:58:00 INFO data_plane.create_data_channel: Creating client data channel for localhost:44843
19/12/04 23:58:00 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 23:58:00 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 23:58:00 INFO sdk_worker.run: No more requests from control plane
19/12/04 23:58:00 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 23:58:00 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 23:58:00 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 23:58:00 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 23:58:00 INFO sdk_worker.run: Done consuming work.
19/12/04 23:58:00 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 23:58:00 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 23:58:01 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 23:58:01 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 23:58:01 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 23:58:01 INFO sdk_worker_main.main: Logging handler created.
19/12/04 23:58:01 INFO sdk_worker_main.start: Status HTTP server running at localhost:37499
19/12/04 23:58:01 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 23:58:01 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 23:58:01 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575503877.23_fd72d4ab-83ad-4431-9734-4220f7a89707', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 23:58:01 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575503877.23', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35557', 'job_port': u'0'}
19/12/04 23:58:01 INFO statecache.__init__: Creating state cache with size 0
19/12/04 23:58:01 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33519.
19/12/04 23:58:01 INFO sdk_worker.__init__: Control channel established.
19/12/04 23:58:01 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/04 23:58:01 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 23:58:01 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37421.
19/12/04 23:58:01 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 23:58:01 INFO data_plane.create_data_channel: Creating client data channel for localhost:43739
19/12/04 23:58:01 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 23:58:01 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 23:58:01 INFO sdk_worker.run: No more requests from control plane
19/12/04 23:58:01 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 23:58:01 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 23:58:01 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 23:58:01 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 23:58:01 INFO sdk_worker.run: Done consuming work.
19/12/04 23:58:01 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 23:58:01 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 23:58:01 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 23:58:02 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 23:58:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 23:58:02 INFO sdk_worker_main.main: Logging handler created.
19/12/04 23:58:02 INFO sdk_worker_main.start: Status HTTP server running at localhost:41013
19/12/04 23:58:02 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 23:58:02 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 23:58:02 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575503877.23_fd72d4ab-83ad-4431-9734-4220f7a89707', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 23:58:02 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575503877.23', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35557', 'job_port': u'0'}
19/12/04 23:58:02 INFO statecache.__init__: Creating state cache with size 0
19/12/04 23:58:02 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33325.
19/12/04 23:58:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/04 23:58:02 INFO sdk_worker.__init__: Control channel established.
19/12/04 23:58:02 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 23:58:02 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35155.
19/12/04 23:58:02 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 23:58:02 INFO data_plane.create_data_channel: Creating client data channel for localhost:36949
19/12/04 23:58:02 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 23:58:02 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 23:58:02 INFO sdk_worker.run: No more requests from control plane
19/12/04 23:58:02 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 23:58:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 23:58:02 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 23:58:02 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 23:58:02 INFO sdk_worker.run: Done consuming work.
19/12/04 23:58:02 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 23:58:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 23:58:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 23:58:02 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575503877.23_fd72d4ab-83ad-4431-9734-4220f7a89707 finished.
19/12/04 23:58:02 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/04 23:58:02 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_3e9d0f1c-e443-4f53-b963-a1cc1832e6de","basePath":"/tmp/sparktestsNnBqU"}: {}
java.io.FileNotFoundException: /tmp/sparktestsNnBqU/job_3e9d0f1c-e443-4f53-b963-a1cc1832e6de/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_ru==================== Timed out after 60 seconds. ====================

nner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
# Thread: <Thread(wait_until_finish_read, started daemon 139849474766592)>

    return self._next()
# Thread: <Thread(Thread-117, started daemon 139849483159296)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
# Thread: <_MainThread(MainThread, started 139850262898432)>
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139849457456896)>

# Thread: <Thread(Thread-123, started daemon 139849449064192)>

# Thread: <Thread(Thread-117, started daemon 139849483159296)>

# Thread: <_MainThread(MainThread, started 139850262898432)>

# Thread: <Thread(wait_until_finish_read, started daemon 139849474766592)>
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575503867.54_117f3f08-99e8-4df1-b1e1-e9fecf799d27 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 315.132s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 55s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/e36g6hotwqfh2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1695

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1695/display/redirect?page=changes>

Changes:

[ehudm] Moving to 2.19.0-SNAPSHOT on master branch.


------------------------------------------
[...truncated 1.31 MB...]
19/12/04 22:27:27 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1575498446.7_6217620d-90fc-4942-ba77-f05e3fc654fc on Spark master local
19/12/04 22:27:27 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
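This warning means the Spark runner will group by encoded key bytes even though one of the coders involved does not guarantee that equal values encode to equal bytes. In the Python SDK the analogous property is Coder.is_deterministic(); a quick check (illustrative):

    # Deterministic coders encode equal values to equal bytes, which is what
    # grouping by encoded key relies on; PickleCoder makes no such promise.
    from apache_beam.coders.coders import PickleCoder, VarIntCoder

    print(VarIntCoder().is_deterministic())  # True
    print(PickleCoder().is_deterministic())  # False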
19/12/04 22:27:27 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575498446.7_6217620d-90fc-4942-ba77-f05e3fc654fc: Pipeline translated successfully. Computing outputs
19/12/04 22:27:27 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 22:27:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 22:27:28 INFO sdk_worker_main.main: Logging handler created.
19/12/04 22:27:28 INFO sdk_worker_main.start: Status HTTP server running at localhost:45975
19/12/04 22:27:28 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 22:27:28 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 22:27:28 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575498446.7_6217620d-90fc-4942-ba77-f05e3fc654fc', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 22:27:28 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575498446.7', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46331', 'job_port': u'0'}
19/12/04 22:27:28 INFO statecache.__init__: Creating state cache with size 0
19/12/04 22:27:28 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44877.
19/12/04 22:27:28 INFO sdk_worker.__init__: Control channel established.
19/12/04 22:27:28 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/04 22:27:28 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 22:27:28 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39551.
19/12/04 22:27:28 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 22:27:28 INFO data_plane.create_data_channel: Creating client data channel for localhost:38497
19/12/04 22:27:28 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 22:27:28 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 22:27:28 INFO sdk_worker.run: No more requests from control plane
19/12/04 22:27:28 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 22:27:28 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 22:27:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 22:27:28 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 22:27:28 INFO sdk_worker.run: Done consuming work.
19/12/04 22:27:28 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 22:27:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 22:27:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 22:27:28 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 22:27:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 22:27:29 INFO sdk_worker_main.main: Logging handler created.
19/12/04 22:27:29 INFO sdk_worker_main.start: Status HTTP server running at localhost:40411
19/12/04 22:27:29 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 22:27:29 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 22:27:29 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575498446.7_6217620d-90fc-4942-ba77-f05e3fc654fc', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 22:27:29 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575498446.7', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46331', 'job_port': u'0'}
19/12/04 22:27:29 INFO statecache.__init__: Creating state cache with size 0
19/12/04 22:27:29 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44355.
19/12/04 22:27:29 INFO sdk_worker.__init__: Control channel established.
19/12/04 22:27:29 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 22:27:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/04 22:27:29 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39953.
19/12/04 22:27:29 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 22:27:29 INFO data_plane.create_data_channel: Creating client data channel for localhost:37581
19/12/04 22:27:29 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 22:27:29 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 22:27:29 INFO sdk_worker.run: No more requests from control plane
19/12/04 22:27:29 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 22:27:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 22:27:29 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 22:27:29 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 22:27:29 INFO sdk_worker.run: Done consuming work.
19/12/04 22:27:29 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 22:27:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 22:27:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 22:27:29 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 22:27:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 22:27:29 INFO sdk_worker_main.main: Logging handler created.
19/12/04 22:27:29 INFO sdk_worker_main.start: Status HTTP server running at localhost:44733
19/12/04 22:27:29 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 22:27:29 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 22:27:29 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575498446.7_6217620d-90fc-4942-ba77-f05e3fc654fc', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 22:27:29 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575498446.7', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46331', 'job_port': u'0'}
19/12/04 22:27:29 INFO statecache.__init__: Creating state cache with size 0
19/12/04 22:27:29 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36773.
19/12/04 22:27:29 INFO sdk_worker.__init__: Control channel established.
19/12/04 22:27:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/04 22:27:29 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 22:27:29 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34905.
19/12/04 22:27:29 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 22:27:29 INFO data_plane.create_data_channel: Creating client data channel for localhost:40779
19/12/04 22:27:29 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 22:27:29 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 22:27:29 INFO sdk_worker.run: No more requests from control plane
19/12/04 22:27:29 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 22:27:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 22:27:29 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 22:27:29 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 22:27:29 INFO sdk_worker.run: Done consuming work.
19/12/04 22:27:29 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 22:27:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 22:27:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 22:27:30 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 22:27:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 22:27:30 INFO sdk_worker_main.main: Logging handler created.
19/12/04 22:27:30 INFO sdk_worker_main.start: Status HTTP server running at localhost:37521
19/12/04 22:27:30 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 22:27:30 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 22:27:30 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575498446.7_6217620d-90fc-4942-ba77-f05e3fc654fc', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 22:27:30 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575498446.7', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46331', 'job_port': u'0'}
19/12/04 22:27:30 INFO statecache.__init__: Creating state cache with size 0
19/12/04 22:27:30 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46031.
19/12/04 22:27:30 INFO sdk_worker.__init__: Control channel established.
19/12/04 22:27:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/04 22:27:30 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 22:27:30 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35265.
19/12/04 22:27:30 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 22:27:30 INFO data_plane.create_data_channel: Creating client data channel for localhost:45389
19/12/04 22:27:30 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 22:27:30 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 22:27:30 INFO sdk_worker.run: No more requests from control plane
19/12/04 22:27:30 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 22:27:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 22:27:30 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 22:27:30 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 22:27:30 INFO sdk_worker.run: Done consuming work.
19/12/04 22:27:30 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 22:27:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 22:27:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 22:27:30 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 22:27:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 22:27:31 INFO sdk_worker_main.main: Logging handler created.
19/12/04 22:27:31 INFO sdk_worker_main.start: Status HTTP server running at localhost:39233
19/12/04 22:27:31 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 22:27:31 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 22:27:31 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575498446.7_6217620d-90fc-4942-ba77-f05e3fc654fc', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 22:27:31 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575498446.7', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46331', 'job_port': u'0'}
19/12/04 22:27:31 INFO statecache.__init__: Creating state cache with size 0
19/12/04 22:27:31 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36127.
19/12/04 22:27:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/04 22:27:31 INFO sdk_worker.__init__: Control channel established.
19/12/04 22:27:31 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 22:27:31 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44415.
19/12/04 22:27:31 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 22:27:31 INFO data_plane.create_data_channel: Creating client data channel for localhost:36371
19/12/04 22:27:31 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 22:27:31 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 22:27:31 INFO sdk_worker.run: No more requests from control plane
19/12/04 22:27:31 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 22:27:31 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 22:27:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 22:27:31 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 22:27:31 INFO sdk_worker.run: Done consuming work.
19/12/04 22:27:31 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 22:27:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 22:27:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
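
For context: the "Discarding unparseable args" warnings above come from PipelineOptions parsing. Flags the Python SDK does not recognize (runner-specific ones such as --spark_master) are dropped with that warning when the options are materialized. A minimal sketch of the behavior using only the public apache_beam API (the flag values here are illustrative):

    from apache_beam.options.pipeline_options import PipelineOptions

    # Known flags are parsed; unknown ones (e.g. --spark_master) are
    # discarded with a "Discarding unparseable args" warning when
    # get_all_options() is called.
    opts = PipelineOptions(['--job_name=demo', '--spark_master=local'])
    print(opts.get_all_options(drop_default=True))
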
19/12/04 22:27:31 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575498446.7_6217620d-90fc-4942-ba77-f05e3fc654fc finished.
19/12/04 22:27:31 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/04 22:27:31 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_d1606f06-1da3-4b38-8c3a-56c6121087b8","basePath":"/tmp/sparktestm4vBWQ"}: {}
java.io.FileNotFoundException: /tmp/sparktestm4vBWQ/job_d1606f06-1da3-4b38-8c3a-56c6121087b8/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
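
For context: the "Failed to remove job staging directory" warning above is benign. The test staged no artifacts (note the earlier "GetManifest for __no_artifacts_staged__" lines), so the MANIFEST file the cleanup tries to read was never written. A sketch of cleanup that tolerates the missing directory, in illustrative Python (the actual service is the Java BeamFileSystemArtifactStagingService shown in the stack trace):

    import errno
    import os
    import shutil

    def remove_job_staging_dir(base_path, session_id):
        # Hypothetical tolerant cleanup: silently skip a staging
        # directory that was never created, instead of logging a
        # FileNotFoundException.
        job_dir = os.path.join(base_path, session_id)
        try:
            shutil.rmtree(job_dir)
        except OSError as e:
            if e.errno != errno.ENOENT:
                raise
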
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140240255547136)>

# Thread: <Thread(Thread-119, started daemon 140240247154432)>

# Thread: <_MainThread(MainThread, started 140241037457152)>

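For context: the "Timed out after 60 seconds" failures all follow the same pattern. The traceback ends in portable_runner_test.py, line 75, in a function named handler that raises BaseException, and a dump of the live threads is printed alongside. A minimal sketch of such a watchdog, assuming it is driven by SIGALRM (the test suite's actual wiring may differ):

    import signal
    import threading

    TIMEOUT_SECS = 60

    def install_timeout(timeout=TIMEOUT_SECS):
        def handler(signum, frame):
            msg = 'Timed out after %d seconds.' % timeout
            print('=' * 20 + ' ' + msg + ' ' + '=' * 20)
            for t in threading.enumerate():
                # Matches the "# Thread: <...>" lines seen above.
                print('# Thread: %s' % t)
            # BaseException, not Exception, so that broad except clauses
            # in pipeline code cannot swallow the timeout.
            raise BaseException(msg)
        signal.signal(signal.SIGALRM, handler)
        signal.alarm(timeout)

Because the handler prints while other threads are still writing to the same stream, its banner and thread dump can interleave with in-flight tracebacks in the raw console output.
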
======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575498438.29_fe91c121-f28a-468b-b72a-f99292055e55 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
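
For reference, the failing assertion uses Beam's testing matchers: the test builds a pipeline inside a with-block (whose __exit__ calls run().wait_until_finish(), as in the traceback) and asserts the output equals the flattened characters of the input. A simplified, hypothetical version of that pattern, with FlatMap standing in for the splittable DoFn the real test exercises:

    import apache_beam as beam
    from apache_beam.testing.util import assert_that, equal_to

    data = ['ab', 'cd']
    with beam.Pipeline() as p:  # __exit__ runs the pipeline and waits
        actual = (p
                  | beam.Create(data)
                  | beam.FlatMap(list))  # split each string into characters
        assert_that(actual, equal_to(list(''.join(data))))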

----------------------------------------------------------------------
Ran 38 tests in 278.453s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 23s
60 actionable tasks: 57 executed, 3 from cache

Publishing build scan...
https://scans.gradle.com/s/hx2fbet5v7hzm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1694

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1694/display/redirect?page=changes>

Changes:

[lcwik] [BEAM-4287] Add trySplit API to Java restriction tracker matching Python

[lcwik] fixup!

[github] Add a comment on RLock perf issues

[lcwik] fixup!


------------------------------------------
[...truncated 1.32 MB...]
19/12/04 21:45:29 INFO sdk_worker_main.start: Status HTTP server running at localhost:44609
19/12/04 21:45:29 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 21:45:29 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 21:45:29 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575495926.47_23753869-7295-4cd5-9ca3-0c5d3d6306fa', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 21:45:29 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575495926.47', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46985', 'job_port': u'0'}
19/12/04 21:45:29 INFO statecache.__init__: Creating state cache with size 0
19/12/04 21:45:29 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42999.
19/12/04 21:45:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/04 21:45:29 INFO sdk_worker.__init__: Control channel established.
19/12/04 21:45:29 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 21:45:29 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41635.
19/12/04 21:45:29 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 21:45:29 INFO data_plane.create_data_channel: Creating client data channel for localhost:45093
19/12/04 21:45:29 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 21:45:29 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 21:45:29 INFO sdk_worker.run: No more requests from control plane
19/12/04 21:45:29 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 21:45:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 21:45:29 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 21:45:29 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 21:45:29 INFO sdk_worker.run: Done consuming work.
19/12/04 21:45:29 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 21:45:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 21:45:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 21:45:29 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 21:45:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 21:45:30 INFO sdk_worker_main.main: Logging handler created.
19/12/04 21:45:30 INFO sdk_worker_main.start: Status HTTP server running at localhost:34477
19/12/04 21:45:30 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 21:45:30 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 21:45:30 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575495926.47_23753869-7295-4cd5-9ca3-0c5d3d6306fa', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 21:45:30 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575495926.47', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46985', 'job_port': u'0'}
19/12/04 21:45:30 INFO statecache.__init__: Creating state cache with size 0
19/12/04 21:45:30 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34541.
19/12/04 21:45:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/04 21:45:30 INFO sdk_worker.__init__: Control channel established.
19/12/04 21:45:30 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 21:45:30 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41721.
19/12/04 21:45:30 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 21:45:30 INFO data_plane.create_data_channel: Creating client data channel for localhost:41203
19/12/04 21:45:30 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 21:45:30 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 21:45:30 INFO sdk_worker.run: No more requests from control plane
19/12/04 21:45:30 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 21:45:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 21:45:30 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 21:45:30 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 21:45:30 INFO sdk_worker.run: Done consuming work.
19/12/04 21:45:30 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 21:45:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 21:45:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 21:45:30 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 21:45:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 21:45:31 INFO sdk_worker_main.main: Logging handler created.
19/12/04 21:45:31 INFO sdk_worker_main.start: Status HTTP server running at localhost:41469
19/12/04 21:45:31 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 21:45:31 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 21:45:31 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575495926.47_23753869-7295-4cd5-9ca3-0c5d3d6306fa', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 21:45:31 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575495926.47', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46985', 'job_port': u'0'}
19/12/04 21:45:31 INFO statecache.__init__: Creating state cache with size 0
19/12/04 21:45:31 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33105.
19/12/04 21:45:31 INFO sdk_worker.__init__: Control channel established.
19/12/04 21:45:31 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 21:45:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/04 21:45:31 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37851.
19/12/04 21:45:31 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 21:45:31 INFO data_plane.create_data_channel: Creating client data channel for localhost:39851
19/12/04 21:45:31 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 21:45:31 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 21:45:31 INFO sdk_worker.run: No more requests from control plane
19/12/04 21:45:31 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 21:45:31 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 21:45:31 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 21:45:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 21:45:31 INFO sdk_worker.run: Done consuming work.
19/12/04 21:45:31 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 21:45:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 21:45:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 21:45:31 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 21:45:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 21:45:32 INFO sdk_worker_main.main: Logging handler created.
19/12/04 21:45:32 INFO sdk_worker_main.start: Status HTTP server running at localhost:42987
19/12/04 21:45:32 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 21:45:32 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 21:45:32 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575495926.47_23753869-7295-4cd5-9ca3-0c5d3d6306fa', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 21:45:32 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575495926.47', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46985', 'job_port': u'0'}
19/12/04 21:45:32 INFO statecache.__init__: Creating state cache with size 0
19/12/04 21:45:32 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44831.
19/12/04 21:45:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/04 21:45:32 INFO sdk_worker.__init__: Control channel established.
19/12/04 21:45:32 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 21:45:32 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33619.
19/12/04 21:45:32 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 21:45:32 INFO data_plane.create_data_channel: Creating client data channel for localhost:34889
19/12/04 21:45:32 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 21:45:32 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 21:45:32 INFO sdk_worker.run: No more requests from control plane
19/12/04 21:45:32 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 21:45:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 21:45:32 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 21:45:32 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 21:45:32 INFO sdk_worker.run: Done consuming work.
19/12/04 21:45:32 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 21:45:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 21:45:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 21:45:32 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575495926.47_23753869-7295-4cd5-9ca3-0c5d3d6306fa finished.
19/12/04 21:45:32 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/04 21:45:32 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_12ccfbc0-b337-4b65-8bc3-0cb9db77b6a6","basePath":"/tmp/sparktestY0PgyL"}: {}
java.io.FileNotFoundException: /tmp/sparktestY0PgyL/job_12ccfbc0-b337-4b65-8bc3-0cb9db77b6a6/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140518830888704)>

# Thread: <Thread(Thread-120, started daemon 140518847674112)>

# Thread: <_MainThread(MainThread, started 140519969101568)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575495916.08_6e1630e4-802b-486b-8101-4dfbd37226c9 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140518814103296)>

# Thread: <Thread(Thread-126, started daemon 140518822496000)>

# Thread: <_MainThread(MainThread, started 140519969101568)>

# Thread: <Thread(Thread-120, started daemon 140518847674112)>

# Thread: <Thread(wait_until_finish_read, started daemon 140518830888704)>

----------------------------------------------------------------------
Ran 38 tests in 337.548s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 37s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/s6utnjquisoy4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1693

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1693/display/redirect?page=changes>

Changes:

[ehudm] [BEAM-8662] Remove Py3 annotations support from

[github] [BEAM-8481] Revert the increase in Postcommit timeout


------------------------------------------
[...truncated 1.32 MB...]
19/12/04 20:35:26 INFO sdk_worker_main.start: Status HTTP server running at localhost:45247
19/12/04 20:35:26 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 20:35:26 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 20:35:26 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575491723.72_9378cec0-a42a-4483-82d5-5cc7ae12e54d', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 20:35:26 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575491723.72', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41411', 'job_port': u'0'}
19/12/04 20:35:26 INFO statecache.__init__: Creating state cache with size 0
19/12/04 20:35:26 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42111.
19/12/04 20:35:26 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/04 20:35:26 INFO sdk_worker.__init__: Control channel established.
19/12/04 20:35:26 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 20:35:26 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34755.
19/12/04 20:35:26 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 20:35:26 INFO data_plane.create_data_channel: Creating client data channel for localhost:43895
19/12/04 20:35:26 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 20:35:26 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 20:35:26 INFO sdk_worker.run: No more requests from control plane
19/12/04 20:35:26 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 20:35:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 20:35:26 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 20:35:26 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 20:35:26 INFO sdk_worker.run: Done consuming work.
19/12/04 20:35:26 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 20:35:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 20:35:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 20:35:26 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 20:35:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 20:35:27 INFO sdk_worker_main.main: Logging handler created.
19/12/04 20:35:27 INFO sdk_worker_main.start: Status HTTP server running at localhost:41337
19/12/04 20:35:27 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 20:35:27 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 20:35:27 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575491723.72_9378cec0-a42a-4483-82d5-5cc7ae12e54d', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 20:35:27 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575491723.72', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41411', 'job_port': u'0'}
19/12/04 20:35:27 INFO statecache.__init__: Creating state cache with size 0
19/12/04 20:35:27 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43403.
19/12/04 20:35:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/04 20:35:27 INFO sdk_worker.__init__: Control channel established.
19/12/04 20:35:27 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 20:35:27 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45905.
19/12/04 20:35:27 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 20:35:27 INFO data_plane.create_data_channel: Creating client data channel for localhost:44551
19/12/04 20:35:27 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 20:35:27 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 20:35:27 INFO sdk_worker.run: No more requests from control plane
19/12/04 20:35:27 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 20:35:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 20:35:27 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 20:35:27 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 20:35:27 INFO sdk_worker.run: Done consuming work.
19/12/04 20:35:27 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 20:35:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 20:35:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 20:35:27 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 20:35:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 20:35:28 INFO sdk_worker_main.main: Logging handler created.
19/12/04 20:35:28 INFO sdk_worker_main.start: Status HTTP server running at localhost:39731
19/12/04 20:35:28 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 20:35:28 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 20:35:28 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575491723.72_9378cec0-a42a-4483-82d5-5cc7ae12e54d', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 20:35:28 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575491723.72', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41411', 'job_port': u'0'}
19/12/04 20:35:28 INFO statecache.__init__: Creating state cache with size 0
19/12/04 20:35:28 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46547.
19/12/04 20:35:28 INFO sdk_worker.__init__: Control channel established.
19/12/04 20:35:28 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/04 20:35:28 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 20:35:28 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41475.
19/12/04 20:35:28 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 20:35:28 INFO data_plane.create_data_channel: Creating client data channel for localhost:33213
19/12/04 20:35:28 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 20:35:28 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 20:35:28 INFO sdk_worker.run: No more requests from control plane
19/12/04 20:35:28 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 20:35:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 20:35:28 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 20:35:28 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 20:35:28 INFO sdk_worker.run: Done consuming work.
19/12/04 20:35:28 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 20:35:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 20:35:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 20:35:28 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 20:35:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 20:35:29 INFO sdk_worker_main.main: Logging handler created.
19/12/04 20:35:29 INFO sdk_worker_main.start: Status HTTP server running at localhost:43651
19/12/04 20:35:29 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 20:35:29 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 20:35:29 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575491723.72_9378cec0-a42a-4483-82d5-5cc7ae12e54d', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 20:35:29 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575491723.72', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41411', 'job_port': u'0'}
19/12/04 20:35:29 INFO statecache.__init__: Creating state cache with size 0
19/12/04 20:35:29 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40385.
19/12/04 20:35:29 INFO sdk_worker.__init__: Control channel established.
19/12/04 20:35:29 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 20:35:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/04 20:35:29 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41981.
19/12/04 20:35:29 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 20:35:29 INFO data_plane.create_data_channel: Creating client data channel for localhost:39379
19/12/04 20:35:29 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 20:35:29 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 20:35:29 INFO sdk_worker.run: No more requests from control plane
19/12/04 20:35:29 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 20:35:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 20:35:29 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 20:35:29 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 20:35:29 INFO sdk_worker.run: Done consuming work.
19/12/04 20:35:29 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 20:35:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 20:35:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 20:35:29 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575491723.72_9378cec0-a42a-4483-82d5-5cc7ae12e54d finished.
19/12/04 20:35:29 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/04 20:35:29 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_374907c7-30f5-48b9-9de6-f3cc677df2a7","basePath":"/tmp/sparktestXF497R"}: {}
java.io.FileNotFoundException: /tmp/sparktestXF497R/job_374907c7-30f5-48b9-9de6-f3cc677df2a7/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140239925823232)>

# Thread: <Thread(Thread-119, started daemon 140239648880384)>

# Thread: <_MainThread(MainThread, started 140240444344064)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140239630259968)>

# Thread: <Thread(Thread-125, started daemon 140239638652672)>

# Thread: <Thread(Thread-119, started daemon 140239648880384)>

# Thread: <_MainThread(MainThread, started 140240444344064)>

# Thread: <Thread(wait_until_finish_read, started daemon 140239925823232)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575491713.33_7b1ea2e6-8a34-43f5-81ee-cd1f99a33812 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 355.118s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 51s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/f4ns76y4jvqju

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1692

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1692/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/12/04 18:21:56 INFO sdk_worker_main.start: Status HTTP server running at localhost:32909
19/12/04 18:21:56 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 18:21:56 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 18:21:56 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575483714.08_63100ec5-fdc3-4adf-bd50-6e11b275fb44', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 18:21:56 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575483714.08', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58617', 'job_port': u'0'}
19/12/04 18:21:56 INFO statecache.__init__: Creating state cache with size 0
19/12/04 18:21:56 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39321.
19/12/04 18:21:56 INFO sdk_worker.__init__: Control channel established.
19/12/04 18:21:56 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 18:21:56 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/04 18:21:56 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42065.
19/12/04 18:21:56 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 18:21:56 INFO data_plane.create_data_channel: Creating client data channel for localhost:42201
19/12/04 18:21:56 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 18:21:56 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 18:21:56 INFO sdk_worker.run: No more requests from control plane
19/12/04 18:21:56 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 18:21:56 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 18:21:56 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 18:21:56 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 18:21:56 INFO sdk_worker.run: Done consuming work.
19/12/04 18:21:56 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 18:21:56 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 18:21:56 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 18:21:56 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 18:21:57 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 18:21:57 INFO sdk_worker_main.main: Logging handler created.
19/12/04 18:21:57 INFO sdk_worker_main.start: Status HTTP server running at localhost:41679
19/12/04 18:21:57 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 18:21:57 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 18:21:57 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575483714.08_63100ec5-fdc3-4adf-bd50-6e11b275fb44', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 18:21:57 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575483714.08', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58617', 'job_port': u'0'}
19/12/04 18:21:57 INFO statecache.__init__: Creating state cache with size 0
19/12/04 18:21:57 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38849.
19/12/04 18:21:57 INFO sdk_worker.__init__: Control channel established.
19/12/04 18:21:57 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/04 18:21:57 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 18:21:57 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42513.
19/12/04 18:21:57 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 18:21:57 INFO data_plane.create_data_channel: Creating client data channel for localhost:40479
19/12/04 18:21:57 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 18:21:57 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 18:21:57 INFO sdk_worker.run: No more requests from control plane
19/12/04 18:21:57 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 18:21:57 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 18:21:57 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 18:21:57 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 18:21:57 INFO sdk_worker.run: Done consuming work.
19/12/04 18:21:57 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 18:21:57 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 18:21:57 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 18:21:57 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 18:21:58 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 18:21:58 INFO sdk_worker_main.main: Logging handler created.
19/12/04 18:21:58 INFO sdk_worker_main.start: Status HTTP server running at localhost:33177
19/12/04 18:21:58 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 18:21:58 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 18:21:58 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575483714.08_63100ec5-fdc3-4adf-bd50-6e11b275fb44', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 18:21:58 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575483714.08', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58617', 'job_port': u'0'}
19/12/04 18:21:58 INFO statecache.__init__: Creating state cache with size 0
19/12/04 18:21:58 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34933.
19/12/04 18:21:58 INFO sdk_worker.__init__: Control channel established.
19/12/04 18:21:58 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 18:21:58 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/04 18:21:58 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39035.
19/12/04 18:21:58 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 18:21:58 INFO data_plane.create_data_channel: Creating client data channel for localhost:42339
19/12/04 18:21:58 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 18:21:58 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 18:21:58 INFO sdk_worker.run: No more requests from control plane
19/12/04 18:21:58 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 18:21:58 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 18:21:58 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 18:21:58 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 18:21:58 INFO sdk_worker.run: Done consuming work.
19/12/04 18:21:58 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 18:21:58 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 18:21:58 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 18:21:58 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 18:21:59 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 18:21:59 INFO sdk_worker_main.main: Logging handler created.
19/12/04 18:21:59 INFO sdk_worker_main.start: Status HTTP server running at localhost:41455
19/12/04 18:21:59 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 18:21:59 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 18:21:59 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575483714.08_63100ec5-fdc3-4adf-bd50-6e11b275fb44', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 18:21:59 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575483714.08', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58617', 'job_port': u'0'}
19/12/04 18:21:59 INFO statecache.__init__: Creating state cache with size 0
19/12/04 18:21:59 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34365.
19/12/04 18:21:59 INFO sdk_worker.__init__: Control channel established.
19/12/04 18:21:59 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/04 18:21:59 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 18:21:59 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44115.
19/12/04 18:21:59 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 18:21:59 INFO data_plane.create_data_channel: Creating client data channel for localhost:37205
19/12/04 18:21:59 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 18:21:59 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 18:21:59 INFO sdk_worker.run: No more requests from control plane
19/12/04 18:21:59 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 18:21:59 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 18:21:59 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 18:21:59 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 18:21:59 INFO sdk_worker.run: Done consuming work.
19/12/04 18:21:59 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 18:21:59 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 18:21:59 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 18:21:59 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575483714.08_63100ec5-fdc3-4adf-bd50-6e11b275fb44 finished.
19/12/04 18:21:59 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/04 18:21:59 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_9efbd784-b92d-4c0a-92d5-23fe331890cc","basePath":"/tmp/sparktesthIEktb"}: {}
java.io.FileNotFoundException: /tmp/sparktesthIEktb/job_9efbd784-b92d-4c0a-92d5-23fe331890cc/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
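
[Note on the WARN above] This FileNotFoundException is a benign cleanup race, not a test failure: nothing was staged for this job (see the earlier "GetManifest for __no_artifacts_staged__" lines), so when the job service tries to remove the staging directory, the MANIFEST it reads first does not exist. The tolerant cleanup pattern, sketched in Python purely for illustration (the actual code path is Java, in BeamFileSystemArtifactStagingService.removeArtifacts):

    import errno
    import shutil

    def remove_staging_dir(path):
        # If no artifacts were ever staged, the directory (and its MANIFEST)
        # may not exist; treat that as already-clean rather than an error.
        try:
            shutil.rmtree(path)
        except OSError as e:
            if e.errno != errno.ENOENT:
                raise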
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139944758126336)>
# Thread: <Thread(Thread-118, started daemon 139944749733632)>
# Thread: <_MainThread(MainThread, started 139945546258176)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139944127362816)>
# Thread: <Thread(Thread-124, started daemon 139944740292352)>
# Thread: <_MainThread(MainThread, started 139945546258176)>
# Thread: <Thread(Thread-118, started daemon 139944749733632)>
# Thread: <Thread(wait_until_finish_read, started daemon 139944758126336)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575483704.65_6c1f8825-ec69-4452-a1a3-4756f4e7c47f failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 323.691s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 36s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/znwea4wcnnbi2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1691

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1691/display/redirect?page=changes>

Changes:

[crites] Adds translation support for TestStream to Dataflow Java runner.

[crites] Formatting cleanup using gradlew spotlessApply.


------------------------------------------
[...truncated 1.32 MB...]
19/12/04 17:25:16 INFO sdk_worker_main.start: Status HTTP server running at localhost:33789
19/12/04 17:25:16 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 17:25:16 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 17:25:16 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575480314.42_1b58c62c-ffc5-4f99-82c3-bc99c25b4784', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 17:25:16 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575480314.42', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:33645', 'job_port': u'0'}
19/12/04 17:25:16 INFO statecache.__init__: Creating state cache with size 0
19/12/04 17:25:16 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34211.
19/12/04 17:25:16 INFO sdk_worker.__init__: Control channel established.
19/12/04 17:25:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/04 17:25:16 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 17:25:16 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34839.
19/12/04 17:25:16 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 17:25:16 INFO data_plane.create_data_channel: Creating client data channel for localhost:41425
19/12/04 17:25:16 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 17:25:17 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 17:25:17 INFO sdk_worker.run: No more requests from control plane
19/12/04 17:25:17 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 17:25:17 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 17:25:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 17:25:17 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 17:25:17 INFO sdk_worker.run: Done consuming work.
19/12/04 17:25:17 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 17:25:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 17:25:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 17:25:17 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 17:25:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 17:25:17 INFO sdk_worker_main.main: Logging handler created.
19/12/04 17:25:17 INFO sdk_worker_main.start: Status HTTP server running at localhost:36555
19/12/04 17:25:17 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 17:25:17 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 17:25:17 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575480314.42_1b58c62c-ffc5-4f99-82c3-bc99c25b4784', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 17:25:17 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575480314.42', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:33645', 'job_port': u'0'}
19/12/04 17:25:17 INFO statecache.__init__: Creating state cache with size 0
19/12/04 17:25:17 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33543.
19/12/04 17:25:17 INFO sdk_worker.__init__: Control channel established.
19/12/04 17:25:17 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/04 17:25:17 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 17:25:17 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44123.
19/12/04 17:25:17 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 17:25:17 INFO data_plane.create_data_channel: Creating client data channel for localhost:40993
19/12/04 17:25:17 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 17:25:17 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 17:25:17 INFO sdk_worker.run: No more requests from control plane
19/12/04 17:25:17 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 17:25:17 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 17:25:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 17:25:17 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 17:25:17 INFO sdk_worker.run: Done consuming work.
19/12/04 17:25:17 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 17:25:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 17:25:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 17:25:18 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 17:25:18 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 17:25:18 INFO sdk_worker_main.main: Logging handler created.
19/12/04 17:25:18 INFO sdk_worker_main.start: Status HTTP server running at localhost:35015
19/12/04 17:25:18 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 17:25:18 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 17:25:18 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575480314.42_1b58c62c-ffc5-4f99-82c3-bc99c25b4784', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 17:25:18 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575480314.42', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:33645', 'job_port': u'0'}
19/12/04 17:25:18 INFO statecache.__init__: Creating state cache with size 0
19/12/04 17:25:18 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44905.
19/12/04 17:25:18 INFO sdk_worker.__init__: Control channel established.
19/12/04 17:25:18 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/04 17:25:18 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 17:25:18 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33209.
19/12/04 17:25:18 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 17:25:18 INFO data_plane.create_data_channel: Creating client data channel for localhost:39857
19/12/04 17:25:18 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 17:25:18 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 17:25:18 INFO sdk_worker.run: No more requests from control plane
19/12/04 17:25:18 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 17:25:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 17:25:18 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 17:25:18 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 17:25:18 INFO sdk_worker.run: Done consuming work.
19/12/04 17:25:18 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 17:25:18 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 17:25:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 17:25:19 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 17:25:19 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 17:25:19 INFO sdk_worker_main.main: Logging handler created.
19/12/04 17:25:19 INFO sdk_worker_main.start: Status HTTP server running at localhost:45771
19/12/04 17:25:19 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 17:25:19 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 17:25:19 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575480314.42_1b58c62c-ffc5-4f99-82c3-bc99c25b4784', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 17:25:19 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575480314.42', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:33645', 'job_port': u'0'}
19/12/04 17:25:19 INFO statecache.__init__: Creating state cache with size 0
19/12/04 17:25:19 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34843.
19/12/04 17:25:19 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/04 17:25:19 INFO sdk_worker.__init__: Control channel established.
19/12/04 17:25:19 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 17:25:19 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33845.
19/12/04 17:25:19 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 17:25:19 INFO data_plane.create_data_channel: Creating client data channel for localhost:42749
19/12/04 17:25:19 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 17:25:19 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 17:25:19 INFO sdk_worker.run: No more requests from control plane
19/12/04 17:25:19 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 17:25:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 17:25:19 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 17:25:19 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 17:25:19 INFO sdk_worker.run: Done consuming work.
19/12/04 17:25:19 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 17:25:19 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 17:25:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 17:25:19 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575480314.42_1b58c62c-ffc5-4f99-82c3-bc99c25b4784 finished.
19/12/04 17:25:19 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/04 17:25:19 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_0b0fe65a-ecc9-40fb-8b44-e93e64c64324","basePath":"/tmp/sparktestZ9A4JM"}: {}
java.io.FileNotFoundException: /tmp/sparktestZ9A4JM/job_0b0fe65a-ecc9-40fb-8b44-e93e64c64324/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139705756677888)>
# Thread: <Thread(Thread-119, started daemon 139706107401984)>
# Thread: <_MainThread(MainThread, started 139706886641408)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139705739892480)>
# Thread: <Thread(Thread-125, started daemon 139705731499776)>
# Thread: <Thread(Thread-119, started daemon 139706107401984)>
# Thread: <_MainThread(MainThread, started 139706886641408)>
# Thread: <Thread(wait_until_finish_read, started daemon 139705756677888)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575480305.03_bdef18ec-859e-483e-850b-73a697fe50a2 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 315.803s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 39s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/5zmfquzecif5w

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1690

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1690/display/redirect?page=changes>

Changes:

[suztomo] Hadoop client 2.8

[suztomo] Elasticsearch-hadoop's use of commons-httpclient

[suztomo] Hardcoding dfs.nameservices

[suztomo] Updated comment

[suztomo] Fixed unused import


------------------------------------------
[...truncated 1.32 MB...]
19/12/04 16:52:30 INFO sdk_worker_main.start: Status HTTP server running at localhost:33491
19/12/04 16:52:30 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 16:52:30 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 16:52:30 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575478347.63_14c488d0-88bc-4fc7-a9af-471a15b9838e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 16:52:30 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575478347.63', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:59393', 'job_port': u'0'}
19/12/04 16:52:30 INFO statecache.__init__: Creating state cache with size 0
19/12/04 16:52:30 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40291.
19/12/04 16:52:30 INFO sdk_worker.__init__: Control channel established.
19/12/04 16:52:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/04 16:52:30 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 16:52:30 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40911.
19/12/04 16:52:30 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 16:52:30 INFO data_plane.create_data_channel: Creating client data channel for localhost:41867
19/12/04 16:52:30 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 16:52:30 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 16:52:30 INFO sdk_worker.run: No more requests from control plane
19/12/04 16:52:30 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 16:52:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 16:52:30 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 16:52:30 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 16:52:30 INFO sdk_worker.run: Done consuming work.
19/12/04 16:52:30 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 16:52:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 16:52:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 16:52:30 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 16:52:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 16:52:31 INFO sdk_worker_main.main: Logging handler created.
19/12/04 16:52:31 INFO sdk_worker_main.start: Status HTTP server running at localhost:39673
19/12/04 16:52:31 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 16:52:31 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 16:52:31 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575478347.63_14c488d0-88bc-4fc7-a9af-471a15b9838e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 16:52:31 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575478347.63', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:59393', 'job_port': u'0'}
19/12/04 16:52:31 INFO statecache.__init__: Creating state cache with size 0
19/12/04 16:52:31 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33659.
19/12/04 16:52:31 INFO sdk_worker.__init__: Control channel established.
19/12/04 16:52:31 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 16:52:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/04 16:52:31 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34731.
19/12/04 16:52:31 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 16:52:31 INFO data_plane.create_data_channel: Creating client data channel for localhost:33181
19/12/04 16:52:31 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 16:52:31 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 16:52:31 INFO sdk_worker.run: No more requests from control plane
19/12/04 16:52:31 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 16:52:31 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 16:52:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 16:52:31 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 16:52:31 INFO sdk_worker.run: Done consuming work.
19/12/04 16:52:31 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 16:52:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 16:52:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 16:52:31 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 16:52:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 16:52:32 INFO sdk_worker_main.main: Logging handler created.
19/12/04 16:52:32 INFO sdk_worker_main.start: Status HTTP server running at localhost:40707
19/12/04 16:52:32 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 16:52:32 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 16:52:32 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575478347.63_14c488d0-88bc-4fc7-a9af-471a15b9838e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 16:52:32 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575478347.63', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:59393', 'job_port': u'0'}
19/12/04 16:52:32 INFO statecache.__init__: Creating state cache with size 0
19/12/04 16:52:32 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39851.
19/12/04 16:52:32 INFO sdk_worker.__init__: Control channel established.
19/12/04 16:52:32 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 16:52:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/04 16:52:32 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42809.
19/12/04 16:52:32 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 16:52:32 INFO data_plane.create_data_channel: Creating client data channel for localhost:40723
19/12/04 16:52:32 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 16:52:32 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 16:52:32 INFO sdk_worker.run: No more requests from control plane
19/12/04 16:52:32 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 16:52:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 16:52:32 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 16:52:32 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 16:52:32 INFO sdk_worker.run: Done consuming work.
19/12/04 16:52:32 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 16:52:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 16:52:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 16:52:32 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 16:52:33 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 16:52:33 INFO sdk_worker_main.main: Logging handler created.
19/12/04 16:52:33 INFO sdk_worker_main.start: Status HTTP server running at localhost:41565
19/12/04 16:52:33 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 16:52:33 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 16:52:33 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575478347.63_14c488d0-88bc-4fc7-a9af-471a15b9838e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 16:52:33 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575478347.63', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:59393', 'job_port': u'0'}
19/12/04 16:52:33 INFO statecache.__init__: Creating state cache with size 0
19/12/04 16:52:33 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36277.
19/12/04 16:52:33 INFO sdk_worker.__init__: Control channel established.
19/12/04 16:52:33 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/04 16:52:33 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 16:52:33 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34781.
19/12/04 16:52:33 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 16:52:33 INFO data_plane.create_data_channel: Creating client data channel for localhost:42649
19/12/04 16:52:33 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 16:52:33 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 16:52:33 INFO sdk_worker.run: No more requests from control plane
19/12/04 16:52:33 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 16:52:33 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 16:52:33 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 16:52:33 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 16:52:33 INFO sdk_worker.run: Done consuming work.
19/12/04 16:52:33 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 16:52:33 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 16:52:33 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 16:52:33 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575478347.63_14c488d0-88bc-4fc7-a9af-471a15b9838e finished.
19/12/04 16:52:33 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/04 16:52:33 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_1c607c90-51cf-4a3e-a5b9-d13f7871e5e2","basePath":"/tmp/sparktestqcNM7G"}: {}
java.io.FileNotFoundException: /tmp/sparktestqcNM7G/job_1c607c90-51cf-4a3e-a5b9-d13f7871e5e2/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
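
This FileNotFoundException is cleanup noise rather than a test failure: the job staged no artifacts (note the "GetManifest for __no_artifacts_staged__" lines above), so removeArtifacts has no MANIFEST to load when it tears down the staging directory. A best-effort teardown would tolerate the missing manifest; a minimal Python sketch of that idea (illustration only; the real code path is the Java stack above):

    import os
    import shutil

    def remove_staging_dir(base_path, session_id):
        # Best-effort teardown: if no MANIFEST was ever written (nothing
        # was staged), skip loading it and just drop the directory.
        job_dir = os.path.join(base_path, session_id)
        if not os.path.exists(os.path.join(job_dir, 'MANIFEST')):
            shutil.rmtree(job_dir, ignore_errors=True)
            return
        # ...otherwise load the manifest and delete each staged artifact.
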
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
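
This error and the next share one shape: the test blocks in grpc's wait loop until the suite's watchdog fires, and the watchdog raises BaseException rather than Exception so no intermediate `except Exception` can swallow it. Judging from the frames above and the "# Thread:" dumps interleaved in this output, the handler looks roughly like the following sketch (an assumed reconstruction, not the exact portable_runner_test.py code):

    import signal
    import threading

    TIMEOUT_SECS = 60

    def install_watchdog(msg='Timed out after %d seconds.' % TIMEOUT_SECS):
        def handler(signum, frame):
            # Print a banner and every live thread to help debug the hang,
            # then abort the test; BaseException escapes `except Exception`.
            print('=' * 20 + ' ' + msg + ' ' + '=' * 20)
            for t in threading.enumerate():
                print('# Thread: %s' % t)
            raise BaseException(msg)
        signal.signal(signal.SIGALRM, handler)
        signal.alarm(TIMEOUT_SECS)
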

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
==================== Timed out after 60 seconds. ====================
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait

    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
# Thread: <Thread(wait_until_finish_read, started daemon 139865125283584)>

  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

# Thread: <Thread(Thread-117, started daemon 139865108498176)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575478337.44_133dbe3e-7959-4758-b7d2-051857ce95ed failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

# Thread: <_MainThread(MainThread, started 139865905022720)>
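
Unlike the two timeouts, test_sdf_with_watermark_tracking fails fast: the Spark portable runner reaches FAILED because it has no bundle checkpoint handler for splittable DoFn, and wait_until_finish turns any non-DONE terminal state into a RuntimeError. A condensed sketch pieced together from the frames visible above (portable_runner.py lines 428 and 438); everything outside those two frames is an assumption:

    DONE, FAILED = 'DONE', 'FAILED'        # stand-ins for Beam's state enum
    TERMINAL_STATES = (DONE, FAILED)

    class PortableResultSketch(object):
        def __init__(self, job_id, state_stream, last_error):
            self._job_id = job_id
            self._state_stream = state_stream   # gRPC state-response stream
            self._last_error_message = last_error
            self._state = None

        def wait_until_finish(self):
            # Block on the job service's state stream until a terminal
            # state arrives (frame at line 428), then surface failure.
            for state_response in self._state_stream:
                self._state = state_response.state
                if self._state in TERMINAL_STATES:
                    break
            if self._state != DONE:
                raise RuntimeError('Pipeline %s failed in state %s: %s' % (
                    self._job_id, self._state, self._last_error_message()))
            return self._state
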

----------------------------------------------------------------------
Ran 38 tests in 338.990s

FAILED (errors=3, skipped=9)
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139865090926336)>

# Thread: <Thread(Thread-123, started daemon 139865099319040)>

# Thread: <Thread(Thread-117, started daemon 139865108498176)>

# Thread: <_MainThread(MainThread, started 139865905022720)>

# Thread: <Thread(wait_until_finish_read, started daemon 139865125283584)>

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 55s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/7g2xypz3sucxg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1689

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1689/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/12/04 12:10:29 INFO sdk_worker_main.start: Status HTTP server running at localhost:35519
19/12/04 12:10:29 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 12:10:29 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 12:10:29 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575461426.87_aaf1a1fa-b749-4e92-a54b-20f9463a5cb4', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 12:10:29 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575461426.87', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:45413', 'job_port': u'0'}
19/12/04 12:10:29 INFO statecache.__init__: Creating state cache with size 0
19/12/04 12:10:29 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38141.
19/12/04 12:10:29 INFO sdk_worker.__init__: Control channel established.
19/12/04 12:10:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/04 12:10:29 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 12:10:29 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34925.
19/12/04 12:10:29 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 12:10:29 INFO data_plane.create_data_channel: Creating client data channel for localhost:44557
19/12/04 12:10:29 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 12:10:29 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 12:10:29 INFO sdk_worker.run: No more requests from control plane
19/12/04 12:10:29 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 12:10:29 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 12:10:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 12:10:29 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 12:10:29 INFO sdk_worker.run: Done consuming work.
19/12/04 12:10:29 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 12:10:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 12:10:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 12:10:29 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 12:10:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 12:10:30 INFO sdk_worker_main.main: Logging handler created.
19/12/04 12:10:30 INFO sdk_worker_main.start: Status HTTP server running at localhost:40119
19/12/04 12:10:30 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 12:10:30 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 12:10:30 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575461426.87_aaf1a1fa-b749-4e92-a54b-20f9463a5cb4', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 12:10:30 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575461426.87', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:45413', 'job_port': u'0'}
19/12/04 12:10:30 INFO statecache.__init__: Creating state cache with size 0
19/12/04 12:10:30 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39315.
19/12/04 12:10:30 INFO sdk_worker.__init__: Control channel established.
19/12/04 12:10:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/04 12:10:30 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 12:10:30 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33531.
19/12/04 12:10:30 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 12:10:30 INFO data_plane.create_data_channel: Creating client data channel for localhost:45907
19/12/04 12:10:30 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 12:10:30 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 12:10:30 INFO sdk_worker.run: No more requests from control plane
19/12/04 12:10:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 12:10:30 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 12:10:30 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 12:10:30 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 12:10:30 INFO sdk_worker.run: Done consuming work.
19/12/04 12:10:30 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 12:10:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 12:10:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 12:10:30 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 12:10:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 12:10:31 INFO sdk_worker_main.main: Logging handler created.
19/12/04 12:10:31 INFO sdk_worker_main.start: Status HTTP server running at localhost:34501
19/12/04 12:10:31 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 12:10:31 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 12:10:31 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575461426.87_aaf1a1fa-b749-4e92-a54b-20f9463a5cb4', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 12:10:31 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575461426.87', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:45413', 'job_port': u'0'}
19/12/04 12:10:31 INFO statecache.__init__: Creating state cache with size 0
19/12/04 12:10:31 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44337.
19/12/04 12:10:31 INFO sdk_worker.__init__: Control channel established.
19/12/04 12:10:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/04 12:10:31 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 12:10:31 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34801.
19/12/04 12:10:31 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 12:10:31 INFO data_plane.create_data_channel: Creating client data channel for localhost:42231
19/12/04 12:10:31 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 12:10:31 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 12:10:31 INFO sdk_worker.run: No more requests from control plane
19/12/04 12:10:31 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 12:10:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 12:10:31 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 12:10:31 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 12:10:31 INFO sdk_worker.run: Done consuming work.
19/12/04 12:10:31 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 12:10:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 12:10:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 12:10:31 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 12:10:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 12:10:32 INFO sdk_worker_main.main: Logging handler created.
19/12/04 12:10:32 INFO sdk_worker_main.start: Status HTTP server running at localhost:46801
19/12/04 12:10:32 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 12:10:32 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 12:10:32 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575461426.87_aaf1a1fa-b749-4e92-a54b-20f9463a5cb4', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 12:10:32 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575461426.87', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:45413', 'job_port': u'0'}
19/12/04 12:10:32 INFO statecache.__init__: Creating state cache with size 0
19/12/04 12:10:32 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42001.
19/12/04 12:10:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/04 12:10:32 INFO sdk_worker.__init__: Control channel established.
19/12/04 12:10:32 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 12:10:32 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44249.
19/12/04 12:10:32 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 12:10:32 INFO data_plane.create_data_channel: Creating client data channel for localhost:37121
19/12/04 12:10:32 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 12:10:32 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 12:10:32 INFO sdk_worker.run: No more requests from control plane
19/12/04 12:10:32 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 12:10:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 12:10:32 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 12:10:32 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 12:10:32 INFO sdk_worker.run: Done consuming work.
19/12/04 12:10:32 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 12:10:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 12:10:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 12:10:32 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575461426.87_aaf1a1fa-b749-4e92-a54b-20f9463a5cb4 finished.
19/12/04 12:10:32 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/04 12:10:32 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_d3c627bb-b8a0-41dd-89b2-3d9a35210dd0","basePath":"/tmp/sparktestV7ikWu"}: {}
java.io.FileNotFoundException: /tmp/sparktestV7ikWu/job_d3c627bb-b8a0-41dd-89b2-3d9a35210dd0/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139848725260032)>

    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

# Thread: <Thread(Thread-120, started daemon 139848716867328)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
# Thread: <_MainThread(MainThread, started 139849504999168)>
==================== Timed out after 60 seconds. ====================

    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
# Thread: <Thread(wait_until_finish_read, started daemon 139848220403456)>

  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
# Thread: <Thread(Thread-126, started daemon 139848228796160)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
# Thread: <_MainThread(MainThread, started 139849504999168)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
# Thread: <Thread(Thread-120, started daemon 139848716867328)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
# Thread: <Thread(wait_until_finish_read, started daemon 139848725260032)>
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575461416.93_cf58e32e-8b51-46e2-b0cf-60b7a99bbcfe failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 319.111s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 57s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/hjjisjtwhc74u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1688

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1688/display/redirect?page=changes>

Changes:

[sambvfx] [BEAM-8836] Make ExternalTransform unique_name unique

[sambvfx] add simple unique_name test; remove all uses of

[sambvfx] fixup: pylint fix


------------------------------------------
[...truncated 1.32 MB...]
19/12/04 11:48:14 INFO sdk_worker_main.start: Status HTTP server running at localhost:46113
19/12/04 11:48:14 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 11:48:14 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 11:48:14 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575460091.98_e565b90c-7d30-49fe-82ab-72eaf5d26b41', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 11:48:14 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575460091.98', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36225', 'job_port': u'0'}
19/12/04 11:48:14 INFO statecache.__init__: Creating state cache with size 0
19/12/04 11:48:14 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37995.
19/12/04 11:48:14 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/04 11:48:14 INFO sdk_worker.__init__: Control channel established.
19/12/04 11:48:14 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 11:48:14 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37483.
19/12/04 11:48:14 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 11:48:14 INFO data_plane.create_data_channel: Creating client data channel for localhost:36253
19/12/04 11:48:14 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 11:48:14 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 11:48:14 INFO sdk_worker.run: No more requests from control plane
19/12/04 11:48:14 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 11:48:14 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 11:48:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 11:48:14 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 11:48:14 INFO sdk_worker.run: Done consuming work.
19/12/04 11:48:14 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 11:48:14 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 11:48:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 11:48:15 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 11:48:15 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 11:48:15 INFO sdk_worker_main.main: Logging handler created.
19/12/04 11:48:15 INFO sdk_worker_main.start: Status HTTP server running at localhost:36903
19/12/04 11:48:15 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 11:48:15 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 11:48:15 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575460091.98_e565b90c-7d30-49fe-82ab-72eaf5d26b41', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 11:48:15 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575460091.98', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36225', 'job_port': u'0'}
19/12/04 11:48:15 INFO statecache.__init__: Creating state cache with size 0
19/12/04 11:48:15 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43227.
19/12/04 11:48:15 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/04 11:48:15 INFO sdk_worker.__init__: Control channel established.
19/12/04 11:48:15 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 11:48:15 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43895.
19/12/04 11:48:15 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 11:48:15 INFO data_plane.create_data_channel: Creating client data channel for localhost:34779
19/12/04 11:48:15 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 11:48:15 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 11:48:15 INFO sdk_worker.run: No more requests from control plane
19/12/04 11:48:15 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 11:48:15 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 11:48:15 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 11:48:15 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 11:48:15 INFO sdk_worker.run: Done consuming work.
19/12/04 11:48:15 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 11:48:15 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 11:48:16 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 11:48:16 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 11:48:16 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 11:48:16 INFO sdk_worker_main.main: Logging handler created.
19/12/04 11:48:16 INFO sdk_worker_main.start: Status HTTP server running at localhost:33577
19/12/04 11:48:16 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 11:48:16 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 11:48:16 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575460091.98_e565b90c-7d30-49fe-82ab-72eaf5d26b41', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 11:48:16 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575460091.98', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36225', 'job_port': u'0'}
19/12/04 11:48:16 INFO statecache.__init__: Creating state cache with size 0
19/12/04 11:48:16 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38715.
19/12/04 11:48:16 INFO sdk_worker.__init__: Control channel established.
19/12/04 11:48:16 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 11:48:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/04 11:48:16 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33415.
19/12/04 11:48:16 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 11:48:16 INFO data_plane.create_data_channel: Creating client data channel for localhost:38723
19/12/04 11:48:16 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 11:48:16 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 11:48:16 INFO sdk_worker.run: No more requests from control plane
19/12/04 11:48:16 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 11:48:16 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 11:48:16 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 11:48:16 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 11:48:16 INFO sdk_worker.run: Done consuming work.
19/12/04 11:48:16 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 11:48:16 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 11:48:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 11:48:17 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 11:48:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 11:48:17 INFO sdk_worker_main.main: Logging handler created.
19/12/04 11:48:17 INFO sdk_worker_main.start: Status HTTP server running at localhost:44317
19/12/04 11:48:17 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 11:48:17 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 11:48:17 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575460091.98_e565b90c-7d30-49fe-82ab-72eaf5d26b41', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 11:48:17 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575460091.98', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36225', 'job_port': u'0'}
19/12/04 11:48:17 INFO statecache.__init__: Creating state cache with size 0
19/12/04 11:48:17 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38457.
19/12/04 11:48:17 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/04 11:48:17 INFO sdk_worker.__init__: Control channel established.
19/12/04 11:48:17 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 11:48:17 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38275.
19/12/04 11:48:17 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 11:48:17 INFO data_plane.create_data_channel: Creating client data channel for localhost:42971
19/12/04 11:48:17 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 11:48:17 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 11:48:17 INFO sdk_worker.run: No more requests from control plane
19/12/04 11:48:17 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 11:48:17 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 11:48:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 11:48:17 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 11:48:17 INFO sdk_worker.run: Done consuming work.
19/12/04 11:48:17 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 11:48:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 11:48:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 11:48:18 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575460091.98_e565b90c-7d30-49fe-82ab-72eaf5d26b41 finished.
19/12/04 11:48:18 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/04 11:48:18 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_1b31baff-346a-4108-8033-b47e51230def","basePath":"/tmp/sparktestgqYk9O"}: {}
java.io.FileNotFoundException: /tmp/sparktestgqYk9O/job_1b31baff-346a-4108-8033-b47e51230def/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
==================== Timed out after 60 seconds. ====================

    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
# Thread: <Thread(wait_until_finish_read, started daemon 139858077013760)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
# Thread: <Thread(Thread-117, started daemon 139858085406464)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
# Thread: <_MainThread(MainThread, started 139859215615744)>
==================== Timed out after 60 seconds. ====================

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
# Thread: <Thread(wait_until_finish_read, started daemon 139858060228352)>
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)

# Thread: <Thread(Thread-123, started daemon 139858068621056)>

# Thread: <_MainThread(MainThread, started 139859215615744)>

# Thread: <Thread(Thread-117, started daemon 139858085406464)>

# Thread: <Thread(wait_until_finish_read, started daemon 139858077013760)>
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575460080.94_2aadb51c-66d6-48a7-ad23-a04f4fbab691 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
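
The assertions these tests fail on use Beam's testing utilities, assert_that and equal_to, which attach a verification step that runs when the pipeline finishes. For context, a minimal self-contained example of the same shape as the test_sdf_with_watermark_tracking assertion, run on the default local runner rather than the Spark job server:

    import apache_beam as beam
    from apache_beam.testing.util import assert_that, equal_to

    data = ['ab', 'cd']
    with beam.Pipeline() as p:  # leaving the block runs the pipeline
        actual = p | beam.Create(list(''.join(data)))
        # equal_to checks the PCollection's contents irrespective of order
        # once the pipeline completes.
        assert_that(actual, equal_to(list(''.join(data))))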

----------------------------------------------------------------------
Ran 38 tests in 336.104s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 15s
60 actionable tasks: 48 executed, 12 from cache

Publishing build scan...
https://scans.gradle.com/s/ryigf7fbujw3o

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1687

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1687/display/redirect?page=changes>

Changes:

[michal.walenia] [BEAM-8869] Exclude system metrics test from legacy runner test suite


------------------------------------------
[...truncated 1.32 MB...]
19/12/04 11:32:34 INFO sdk_worker_main.start: Status HTTP server running at localhost:37931
19/12/04 11:32:34 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 11:32:34 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 11:32:34 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575459151.68_54d713d7-34cd-4b79-aea8-b202a6506e79', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 11:32:34 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575459151.68', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53997', 'job_port': u'0'}
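
The options in the line above correspond roughly to a pipeline submitted to a portable job server with a process-based SDK harness. A hedged sketch of such an invocation (the endpoint and worker command are placeholders taken from this log, not values you should reuse verbatim):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:53997',  # job server endpoint from the log
        '--environment_type=PROCESS',
        '--environment_config={"command": "/path/to/sdk_worker.sh"}',
        '--experiments=beam_fn_api',
    ])
    with beam.Pipeline(options=options) as p:
        p | beam.Create([1, 2, 3]) | beam.Map(lambda x: x)
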
19/12/04 11:32:34 INFO statecache.__init__: Creating state cache with size 0
19/12/04 11:32:34 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42157.
19/12/04 11:32:34 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/04 11:32:34 INFO sdk_worker.__init__: Control channel established.
19/12/04 11:32:34 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 11:32:34 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40603.
19/12/04 11:32:34 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 11:32:34 INFO data_plane.create_data_channel: Creating client data channel for localhost:38579
19/12/04 11:32:34 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 11:32:34 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 11:32:34 INFO sdk_worker.run: No more requests from control plane
19/12/04 11:32:34 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 11:32:35 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 11:32:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 11:32:35 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 11:32:35 INFO sdk_worker.run: Done consuming work.
19/12/04 11:32:35 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 11:32:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 11:32:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 11:32:35 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 11:32:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 11:32:36 INFO sdk_worker_main.main: Logging handler created.
19/12/04 11:32:36 INFO sdk_worker_main.start: Status HTTP server running at localhost:46191
19/12/04 11:32:36 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 11:32:36 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 11:32:36 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575459151.68_54d713d7-34cd-4b79-aea8-b202a6506e79', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 11:32:36 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575459151.68', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53997', 'job_port': u'0'}
19/12/04 11:32:36 INFO statecache.__init__: Creating state cache with size 0
19/12/04 11:32:36 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41695.
19/12/04 11:32:36 INFO sdk_worker.__init__: Control channel established.
19/12/04 11:32:36 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/04 11:32:36 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 11:32:36 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44641.
19/12/04 11:32:36 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 11:32:36 INFO data_plane.create_data_channel: Creating client data channel for localhost:33221
19/12/04 11:32:36 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
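
The "Creating insecure control channel" and "Creating client data channel" lines above are the SDK harness opening plain (unencrypted) gRPC channels back to the runner. In terms of the public grpcio API, that boils down to roughly the following (the address is a placeholder; the ports in this log are assigned dynamically):

    import grpc

    channel = grpc.insecure_channel('localhost:42157')  # placeholder port
    # Block until the channel is actually connected, or fail after 10s.
    grpc.channel_ready_future(channel).result(timeout=10)
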
19/12/04 11:32:36 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 11:32:36 INFO sdk_worker.run: No more requests from control plane
19/12/04 11:32:36 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 11:32:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 11:32:36 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 11:32:36 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 11:32:36 INFO sdk_worker.run: Done consuming work.
19/12/04 11:32:36 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 11:32:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 11:32:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 11:32:36 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 11:32:37 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 11:32:37 INFO sdk_worker_main.main: Logging handler created.
19/12/04 11:32:37 INFO sdk_worker_main.start: Status HTTP server running at localhost:34587
19/12/04 11:32:37 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 11:32:37 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 11:32:37 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575459151.68_54d713d7-34cd-4b79-aea8-b202a6506e79', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 11:32:37 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575459151.68', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53997', 'job_port': u'0'}
19/12/04 11:32:37 INFO statecache.__init__: Creating state cache with size 0
19/12/04 11:32:37 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38809.
19/12/04 11:32:37 INFO sdk_worker.__init__: Control channel established.
19/12/04 11:32:37 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 11:32:37 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/04 11:32:37 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45645.
19/12/04 11:32:37 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 11:32:37 INFO data_plane.create_data_channel: Creating client data channel for localhost:43051
19/12/04 11:32:37 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 11:32:37 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 11:32:37 INFO sdk_worker.run: No more requests from control plane
19/12/04 11:32:37 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 11:32:37 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 11:32:37 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 11:32:37 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 11:32:37 INFO sdk_worker.run: Done consuming work.
19/12/04 11:32:37 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 11:32:37 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 11:32:37 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 11:32:37 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 11:32:38 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 11:32:38 INFO sdk_worker_main.main: Logging handler created.
19/12/04 11:32:38 INFO sdk_worker_main.start: Status HTTP server running at localhost:32871
19/12/04 11:32:38 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 11:32:38 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 11:32:38 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575459151.68_54d713d7-34cd-4b79-aea8-b202a6506e79', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 11:32:38 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575459151.68', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53997', 'job_port': u'0'}
19/12/04 11:32:38 INFO statecache.__init__: Creating state cache with size 0
19/12/04 11:32:38 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40209.
19/12/04 11:32:38 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/04 11:32:38 INFO sdk_worker.__init__: Control channel established.
19/12/04 11:32:38 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 11:32:38 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34555.
19/12/04 11:32:38 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 11:32:38 INFO data_plane.create_data_channel: Creating client data channel for localhost:42019
19/12/04 11:32:38 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 11:32:38 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 11:32:38 INFO sdk_worker.run: No more requests from control plane
19/12/04 11:32:38 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 11:32:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 11:32:38 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 11:32:38 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 11:32:38 INFO sdk_worker.run: Done consuming work.
19/12/04 11:32:38 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 11:32:38 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 11:32:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 11:32:38 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575459151.68_54d713d7-34cd-4b79-aea8-b202a6506e79 finished.
19/12/04 11:32:38 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/04 11:32:38 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_f099d220-a764-45ad-afe6-50a69ed4ffcb","basePath":"/tmp/sparktestMabO58"}: {}
java.io.FileNotFoundException: /tmp/sparktestMabO58/job_f099d220-a764-45ad-afe6-50a69ed4ffcb/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
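
This cleanup failure is cosmetic: when a job stages no artifacts (note the "__no_artifacts_staged__" retrieval tokens elsewhere in this log), the staging directory has no MANIFEST to load, and the removal step merely logs the exception. The BEAM-8883 change referenced later in this thread downgrades that message's log level for exactly this reason. A defensive version of such a cleanup, sketched in Python rather than the runner's actual Java code (function and parameter names are illustrative):

    import logging
    import os
    import shutil

    def remove_staging_dir(base_path, session_id):
        """Best-effort staging cleanup; illustrative, not Beam's implementation."""
        job_dir = os.path.join(base_path, session_id)
        if not os.path.exists(os.path.join(job_dir, 'MANIFEST')):
            # Nothing was staged; note it quietly instead of warning with a stack trace.
            logging.info('No MANIFEST under %s; skipping artifact cleanup.', job_dir)
            return
        shutil.rmtree(job_dir, ignore_errors=True)
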
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140525282371328)>

# Thread: <Thread(Thread-119, started daemon 140525273978624)>

# Thread: <_MainThread(MainThread, started 140526069442304)>
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140525255096064)>

    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
# Thread: <Thread(Thread-125, started daemon 140525263750912)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
# Thread: <Thread(Thread-119, started daemon 140525273978624)>

# Thread: <Thread(wait_until_finish_read, started daemon 140525282371328)>

# Thread: <_MainThread(MainThread, started 140526069442304)>
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575459140.28_8bdce242-9f94-4309-bd88-c90a9ab91b2a failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 370.820s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 20s
60 actionable tasks: 48 executed, 12 from cache

Publishing build scan...
https://scans.gradle.com/s/gsnibvhy74fqo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1686

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1686/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-8883] downgrade 'Failed to remove job staging directory' log level


------------------------------------------
[...truncated 1.32 MB...]
19/12/04 09:29:40 INFO sdk_worker_main.start: Status HTTP server running at localhost:41711
19/12/04 09:29:40 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 09:29:40 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 09:29:40 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575451777.47_e6b34f60-9214-4951-9c9a-b6fc3b11ffca', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 09:29:40 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575451777.47', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:60061', 'job_port': u'0'}
19/12/04 09:29:40 INFO statecache.__init__: Creating state cache with size 0
19/12/04 09:29:40 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39997.
19/12/04 09:29:40 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/04 09:29:40 INFO sdk_worker.__init__: Control channel established.
19/12/04 09:29:40 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 09:29:40 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36025.
19/12/04 09:29:40 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 09:29:40 INFO data_plane.create_data_channel: Creating client data channel for localhost:43165
19/12/04 09:29:40 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 09:29:40 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 09:29:40 INFO sdk_worker.run: No more requests from control plane
19/12/04 09:29:40 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 09:29:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 09:29:40 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 09:29:40 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 09:29:40 INFO sdk_worker.run: Done consuming work.
19/12/04 09:29:40 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 09:29:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 09:29:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 09:29:40 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 09:29:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 09:29:40 INFO sdk_worker_main.main: Logging handler created.
19/12/04 09:29:40 INFO sdk_worker_main.start: Status HTTP server running at localhost:41625
19/12/04 09:29:40 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 09:29:40 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 09:29:40 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575451777.47_e6b34f60-9214-4951-9c9a-b6fc3b11ffca', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 09:29:40 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575451777.47', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:60061', 'job_port': u'0'}
19/12/04 09:29:40 INFO statecache.__init__: Creating state cache with size 0
19/12/04 09:29:40 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40019.
19/12/04 09:29:40 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/04 09:29:40 INFO sdk_worker.__init__: Control channel established.
19/12/04 09:29:40 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 09:29:40 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36075.
19/12/04 09:29:40 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 09:29:40 INFO data_plane.create_data_channel: Creating client data channel for localhost:44647
19/12/04 09:29:40 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 09:29:41 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 09:29:41 INFO sdk_worker.run: No more requests from control plane
19/12/04 09:29:41 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 09:29:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 09:29:41 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 09:29:41 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 09:29:41 INFO sdk_worker.run: Done consuming work.
19/12/04 09:29:41 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 09:29:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 09:29:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 09:29:41 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 09:29:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 09:29:41 INFO sdk_worker_main.main: Logging handler created.
19/12/04 09:29:41 INFO sdk_worker_main.start: Status HTTP server running at localhost:33333
19/12/04 09:29:41 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 09:29:41 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 09:29:41 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575451777.47_e6b34f60-9214-4951-9c9a-b6fc3b11ffca', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 09:29:41 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575451777.47', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:60061', 'job_port': u'0'}
19/12/04 09:29:41 INFO statecache.__init__: Creating state cache with size 0
19/12/04 09:29:41 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34073.
19/12/04 09:29:41 INFO sdk_worker.__init__: Control channel established.
19/12/04 09:29:41 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 09:29:41 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/04 09:29:41 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36519.
19/12/04 09:29:41 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 09:29:41 INFO data_plane.create_data_channel: Creating client data channel for localhost:42851
19/12/04 09:29:41 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 09:29:41 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 09:29:41 INFO sdk_worker.run: No more requests from control plane
19/12/04 09:29:41 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 09:29:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 09:29:41 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 09:29:41 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 09:29:41 INFO sdk_worker.run: Done consuming work.
19/12/04 09:29:41 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 09:29:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 09:29:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 09:29:42 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 09:29:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 09:29:42 INFO sdk_worker_main.main: Logging handler created.
19/12/04 09:29:42 INFO sdk_worker_main.start: Status HTTP server running at localhost:41501
19/12/04 09:29:42 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 09:29:42 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 09:29:42 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575451777.47_e6b34f60-9214-4951-9c9a-b6fc3b11ffca', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 09:29:42 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575451777.47', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:60061', 'job_port': u'0'}
19/12/04 09:29:42 INFO statecache.__init__: Creating state cache with size 0
19/12/04 09:29:42 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40855.
19/12/04 09:29:42 INFO sdk_worker.__init__: Control channel established.
19/12/04 09:29:42 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 09:29:42 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/04 09:29:42 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33813.
19/12/04 09:29:42 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 09:29:42 INFO data_plane.create_data_channel: Creating client data channel for localhost:40913
19/12/04 09:29:42 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 09:29:42 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 09:29:42 INFO sdk_worker.run: No more requests from control plane
19/12/04 09:29:42 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 09:29:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 09:29:42 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 09:29:42 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 09:29:42 INFO sdk_worker.run: Done consuming work.
19/12/04 09:29:42 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 09:29:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 09:29:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 09:29:42 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575451777.47_e6b34f60-9214-4951-9c9a-b6fc3b11ffca finished.
19/12/04 09:29:42 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/04 09:29:42 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_8785425c-a7e2-4aed-91a8-25eafc25ba56","basePath":"/tmp/sparktestiHzXMo"}: {}
java.io.FileNotFoundException: /tmp/sparktestiHzXMo/job_8785425c-a7e2-4aed-91a8-25eafc25ba56/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140666172253952)>

# Thread: <Thread(Thread-119, started daemon 140666163861248)>

# Thread: <_MainThread(MainThread, started 140666951993088)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140665673475840)>

# Thread: <Thread(Thread-125, started daemon 140665665083136)>

# Thread: <_MainThread(MainThread, started 140666951993088)>

# Thread: <Thread(Thread-119, started daemon 140666163861248)>

# Thread: <Thread(wait_until_finish_read, started daemon 140666172253952)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575451767.93_cf7097d7-eda4-4a7b-b4f8-cb30c67f4d03 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 322.151s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 58s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/gmgru5yqv2me4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1685

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1685/display/redirect>

Changes:


------------------------------------------
[...truncated 1.33 MB...]

19/12/04 06:52:25 INFO sdk_worker.run: No more requests from control plane
19/12/04 06:52:25 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 06:52:25 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 06:52:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 06:52:25 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 06:52:25 INFO sdk_worker.run: Done consuming work.
19/12/04 06:52:25 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 06:52:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 06:52:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 06:52:26 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 06:52:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 06:52:26 INFO sdk_worker_main.main: Logging handler created.
19/12/04 06:52:26 INFO sdk_worker_main.start: Status HTTP server running at localhost:41731
19/12/04 06:52:26 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 06:52:26 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 06:52:26 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575442343.27_79e7d77d-f27a-4083-ab3c-d43dbbe2327d', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 06:52:26 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575442343.27', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40217', 'job_port': u'0'}
19/12/04 06:52:26 INFO statecache.__init__: Creating state cache with size 0
19/12/04 06:52:26 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45521.
19/12/04 06:52:26 INFO sdk_worker.__init__: Control channel established.
19/12/04 06:52:26 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 06:52:26 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/04 06:52:26 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45179.
19/12/04 06:52:26 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 06:52:26 INFO data_plane.create_data_channel: Creating client data channel for localhost:40911
19/12/04 06:52:26 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 06:52:26 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 06:52:26 INFO sdk_worker.run: No more requests from control plane
19/12/04 06:52:26 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 06:52:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 06:52:26 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 06:52:26 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 06:52:26 INFO sdk_worker.run: Done consuming work.
19/12/04 06:52:26 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 06:52:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 06:52:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 06:52:27 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 06:52:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 06:52:27 INFO sdk_worker_main.main: Logging handler created.
19/12/04 06:52:27 INFO sdk_worker_main.start: Status HTTP server running at localhost:39183
19/12/04 06:52:27 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 06:52:27 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 06:52:27 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575442343.27_79e7d77d-f27a-4083-ab3c-d43dbbe2327d', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 06:52:27 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575442343.27', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40217', 'job_port': u'0'}
19/12/04 06:52:27 INFO statecache.__init__: Creating state cache with size 0
19/12/04 06:52:27 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37041.
19/12/04 06:52:27 INFO sdk_worker.__init__: Control channel established.
19/12/04 06:52:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/04 06:52:27 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 06:52:27 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39987.
19/12/04 06:52:27 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 06:52:27 INFO data_plane.create_data_channel: Creating client data channel for localhost:44103
19/12/04 06:52:27 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 06:52:27 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 06:52:27 INFO sdk_worker.run: No more requests from control plane
19/12/04 06:52:27 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 06:52:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 06:52:27 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 06:52:27 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 06:52:27 INFO sdk_worker.run: Done consuming work.
19/12/04 06:52:27 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 06:52:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 06:52:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 06:52:27 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
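The "Discarding unparseable args" warnings above are expected here rather than a symptom: Beam's PipelineOptions parsing is built on argparse and uses parse_known_args, so flags meant for other layers (--options_id, --enable_spark_metric_sinks, ...) are split off and logged instead of aborting the worker. A minimal sketch of that pattern (a simplified stand-in, not Beam's actual parser setup):

    import argparse

    parser = argparse.ArgumentParser()
    parser.add_argument('--job_endpoint')

    # parse_known_args returns (namespace, leftovers) instead of erroring
    # out on flags that were never registered with this parser.
    known, unknown = parser.parse_known_args(
        ['--job_endpoint=localhost:40217', '--options_id=30'])
    print(known.job_endpoint)  # localhost:40217
    print(unknown)             # ['--options_id=30']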
19/12/04 06:52:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 06:52:28 INFO sdk_worker_main.main: Logging handler created.
19/12/04 06:52:28 INFO sdk_worker_main.start: Status HTTP server running at localhost:46015
19/12/04 06:52:28 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 06:52:28 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 06:52:28 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575442343.27_79e7d77d-f27a-4083-ab3c-d43dbbe2327d', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 06:52:28 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575442343.27', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40217', 'job_port': u'0'}
19/12/04 06:52:28 INFO statecache.__init__: Creating state cache with size 0
19/12/04 06:52:28 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39091.
19/12/04 06:52:28 INFO sdk_worker.__init__: Control channel established.
19/12/04 06:52:28 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/04 06:52:28 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 06:52:28 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41299.
19/12/04 06:52:28 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 06:52:28 INFO data_plane.create_data_channel: Creating client data channel for localhost:46159
19/12/04 06:52:28 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 06:52:28 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 06:52:28 INFO sdk_worker.run: No more requests from control plane
19/12/04 06:52:28 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 06:52:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 06:52:28 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 06:52:28 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 06:52:28 INFO sdk_worker.run: Done consuming work.
19/12/04 06:52:28 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 06:52:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 06:52:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
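The repeated 'Closing environment urn: "beam:env:process:v1"' lines above are the runner tearing down PROCESS-type SDK environments: instead of a Docker container, each harness is launched by the sdk_worker.sh command named in environment_config. A sketch of how a pipeline is pointed at such a setup (the endpoint and script path below are placeholders, not the values this suite wires up through Gradle):

    from apache_beam import Pipeline
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:8099',   # placeholder job server endpoint
        '--environment_type=PROCESS',
        # The runner execs this command to start each SDK harness.
        '--environment_config={"command": "/path/to/sdk_worker.sh"}',
    ])
    pipeline = Pipeline(options=options)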
19/12/04 06:52:28 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575442343.27_79e7d77d-f27a-4083-ab3c-d43dbbe2327d finished.
19/12/04 06:52:28 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/04 06:52:28 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_bb58eeaa-3c93-4deb-9b1f-b6b968d0f22d","basePath":"/tmp/sparktestDp1ByL"}: {}
java.io.FileNotFoundException: /tmp/sparktestDp1ByL/job_bb58eeaa-3c93-4deb-9b1f-b6b968d0f22d/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
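The FileNotFoundException above is cleanup noise rather than a test failure: this job staged no artifacts (note the earlier 'GetManifest for __no_artifacts_staged__' lines), so when removeArtifacts tries to read the MANIFEST back it is simply not there, and the job still reaches DONE. The runner code is Java, but the defensive shape that would silence the error is easy to sketch in Python (remove_staged_artifacts is an illustrative helper, not a Beam API):

    import os
    import shutil

    def remove_staged_artifacts(staging_dir):
        # Best-effort cleanup that tolerates a job with no staged artifacts.
        manifest_path = os.path.join(staging_dir, 'MANIFEST')
        if not os.path.exists(manifest_path):
            return  # nothing was staged, so there is nothing to clean up
        # ... otherwise read the manifest and delete the listed artifacts ...
        shutil.rmtree(staging_dir, ignore_errors=True)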
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
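
The 'Timed out after 60 seconds' banners and '# Thread:' listings that accompany the timeouts in this run come from the test harness's watchdog: the 'handler' frame at the bottom of the traceback above dumps the live threads and raises BaseException (not Exception, so broad except clauses cannot swallow it) to unstick a hung wait_until_finish. A minimal sketch of that pattern, assuming a SIGALRM-based implementation (the mechanism and names here are illustrative):

    import signal
    import threading

    TIMEOUT_SECS = 60

    def handler(signum, frame):
        msg = 'Timed out after %d seconds.' % TIMEOUT_SECS
        print('==================== %s ====================' % msg)
        for t in threading.enumerate():
            print('# Thread: %s' % t)   # matches the '# Thread:' lines in this log
        raise BaseException(msg)        # raised in the main thread, aborting the wait

    signal.signal(signal.SIGALRM, handler)
    signal.alarm(TIMEOUT_SECS)          # arm the watchdog before the blocking wait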

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
==================== Timed out after 60 seconds. ====================

    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
# Thread: <Thread(wait_until_finish_read, started daemon 139752301029120)>

    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
# Thread: <Thread(Thread-119, started daemon 139752292636416)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

# Thread: <_MainThread(MainThread, started 139753088661248)>
==================== Timed out after 60 seconds. ====================

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575442334.21_81cdc114-5806-4fc0-b59d-23b2199121a0 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

# Thread: <Thread(wait_until_finish_read, started daemon 139752274278144)>
# Thread: <Thread(Thread-125, started daemon 139752282932992)>
# Thread: <_MainThread(MainThread, started 139753088661248)>
# Thread: <Thread(Thread-119, started daemon 139752292636416)>
# Thread: <Thread(wait_until_finish_read, started daemon 139752301029120)>
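The test_sdf_with_watermark_tracking failure is of a different kind than the two timeouts: the pipeline itself fails because the splittable DoFn asked the runner to checkpoint, i.e. to split off the unprocessed remainder of its restriction and resume it later, and the portable Spark runner's ActiveBundle has no handler registered for that. A toy illustration of the claim/checkpoint split an SDF relies on (plain Python for illustration, not the Beam RestrictionTracker API):

    class ToyOffsetTracker(object):
        # Tracks a half-open offset range [start, stop).

        def __init__(self, start, stop):
            self.stop = stop
            self.position = start

        def try_claim(self, offset):
            # The DoFn must claim each offset before emitting output for it.
            if offset >= self.stop:
                return False
            self.position = offset
            return True

        def checkpoint(self):
            # Split off the unprocessed tail; the *runner* must schedule it
            # later -- the hook this Spark runner has not implemented.
            residual = (self.position + 1, self.stop)
            self.stop = self.position + 1
            return residual

    tracker = ToyOffsetTracker(0, 10)
    assert tracker.try_claim(0)
    print(tracker.checkpoint())  # (1, 10): residual work to resume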
----------------------------------------------------------------------
Ran 38 tests in 318.360s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 5s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...

Publishing failed.

The response from https://scans-in.gradle.com/in/5.2.1/2.3 was not from the build scan server.
Your network environment may be interfering, or the service may be unavailable.

If you believe this to be in error, please report this problem via https://gradle.com/scans/help/plugin and include the following via copy/paste:

----------
Gradle version: 5.2.1
Plugin version: 2.3
Request URL: https://scans-in.gradle.com/in/5.2.1/2.3
Request ID: a1a7bad1-415c-4610-862c-59b434734332
Response status code: 502
Response content type: text/html; charset=UTF-8
Response server type: cloudflare
----------

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1684

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1684/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/12/04 04:58:28 INFO sdk_worker_main.start: Status HTTP server running at localhost:33121
19/12/04 04:58:28 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 04:58:28 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 04:58:28 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575435506.04_1a77ed46-84e8-4661-8b1d-2474e24a29cf', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 04:58:28 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575435506.04', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54053', 'job_port': u'0'}
19/12/04 04:58:28 INFO statecache.__init__: Creating state cache with size 0
19/12/04 04:58:28 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35433.
19/12/04 04:58:28 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/04 04:58:28 INFO sdk_worker.__init__: Control channel established.
19/12/04 04:58:28 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 04:58:28 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34081.
19/12/04 04:58:28 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 04:58:28 INFO data_plane.create_data_channel: Creating client data channel for localhost:45463
19/12/04 04:58:28 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 04:58:28 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 04:58:28 INFO sdk_worker.run: No more requests from control plane
19/12/04 04:58:28 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 04:58:28 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 04:58:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 04:58:28 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 04:58:28 INFO sdk_worker.run: Done consuming work.
19/12/04 04:58:28 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 04:58:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 04:58:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 04:58:29 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 04:58:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 04:58:29 INFO sdk_worker_main.main: Logging handler created.
19/12/04 04:58:29 INFO sdk_worker_main.start: Status HTTP server running at localhost:34309
19/12/04 04:58:29 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 04:58:29 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 04:58:29 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575435506.04_1a77ed46-84e8-4661-8b1d-2474e24a29cf', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 04:58:29 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575435506.04', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54053', 'job_port': u'0'}
19/12/04 04:58:29 INFO statecache.__init__: Creating state cache with size 0
19/12/04 04:58:29 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46679.
19/12/04 04:58:29 INFO sdk_worker.__init__: Control channel established.
19/12/04 04:58:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/04 04:58:29 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 04:58:29 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46499.
19/12/04 04:58:29 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 04:58:29 INFO data_plane.create_data_channel: Creating client data channel for localhost:45663
19/12/04 04:58:29 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 04:58:29 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 04:58:29 INFO sdk_worker.run: No more requests from control plane
19/12/04 04:58:29 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 04:58:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 04:58:29 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 04:58:29 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 04:58:29 INFO sdk_worker.run: Done consuming work.
19/12/04 04:58:29 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 04:58:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 04:58:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 04:58:30 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 04:58:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 04:58:30 INFO sdk_worker_main.main: Logging handler created.
19/12/04 04:58:30 INFO sdk_worker_main.start: Status HTTP server running at localhost:34819
19/12/04 04:58:30 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 04:58:30 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 04:58:30 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575435506.04_1a77ed46-84e8-4661-8b1d-2474e24a29cf', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 04:58:30 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575435506.04', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54053', 'job_port': u'0'}
19/12/04 04:58:30 INFO statecache.__init__: Creating state cache with size 0
19/12/04 04:58:30 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35257.
19/12/04 04:58:30 INFO sdk_worker.__init__: Control channel established.
19/12/04 04:58:30 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 04:58:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/04 04:58:30 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37191.
19/12/04 04:58:30 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 04:58:30 INFO data_plane.create_data_channel: Creating client data channel for localhost:36143
19/12/04 04:58:30 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 04:58:30 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 04:58:30 INFO sdk_worker.run: No more requests from control plane
19/12/04 04:58:30 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 04:58:30 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 04:58:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 04:58:30 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 04:58:30 INFO sdk_worker.run: Done consuming work.
19/12/04 04:58:30 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 04:58:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 04:58:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 04:58:30 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 04:58:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 04:58:31 INFO sdk_worker_main.main: Logging handler created.
19/12/04 04:58:31 INFO sdk_worker_main.start: Status HTTP server running at localhost:43261
19/12/04 04:58:31 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 04:58:31 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 04:58:31 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575435506.04_1a77ed46-84e8-4661-8b1d-2474e24a29cf', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 04:58:31 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575435506.04', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54053', 'job_port': u'0'}
19/12/04 04:58:31 INFO statecache.__init__: Creating state cache with size 0
19/12/04 04:58:31 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36175.
19/12/04 04:58:31 INFO sdk_worker.__init__: Control channel established.
19/12/04 04:58:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/04 04:58:31 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 04:58:31 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36539.
19/12/04 04:58:31 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 04:58:31 INFO data_plane.create_data_channel: Creating client data channel for localhost:46839
19/12/04 04:58:31 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 04:58:31 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 04:58:31 INFO sdk_worker.run: No more requests from control plane
19/12/04 04:58:31 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 04:58:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 04:58:31 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 04:58:31 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 04:58:31 INFO sdk_worker.run: Done consuming work.
19/12/04 04:58:31 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 04:58:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 04:58:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 04:58:31 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575435506.04_1a77ed46-84e8-4661-8b1d-2474e24a29cf finished.
19/12/04 04:58:31 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/04 04:58:31 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_d8a7a09d-4a6c-40b2-a70c-0af5cf17f268","basePath":"/tmp/sparktestdgvhvE"}: {}
java.io.FileNotFoundException: /tmp/sparktestdgvhvE/job_d8a7a09d-4a6c-40b2-a70c-0af5cf17f268/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140241098508032)>
# Thread: <Thread(Thread-119, started daemon 140241106900736)>
# Thread: <_MainThread(MainThread, started 140241895032576)>
==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140240614323968)>
# Thread: <Thread(Thread-125, started daemon 140240605931264)>
# Thread: <Thread(Thread-119, started daemon 140241106900736)>
# Thread: <_MainThread(MainThread, started 140241895032576)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575435496.4_09206508-b8cf-48f8-9ca0-59bc5bda2f98 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

# Thread: <Thread(wait_until_finish_read, started daemon 140241098508032)>

----------------------------------------------------------------------
Ran 38 tests in 313.066s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 2s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/eymadeyilpyms

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1683

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1683/display/redirect?page=changes>

Changes:

[ehudm] [BEAM-8489] Filter: don't use callable's output type

[lostluck] [GoSDK] Handle data write errors & stream recreate

[github] [BEAM-8835] Disable Flink Uber Jar by default. (#10270)

[lostluck] [GoSDK] Cancel stream context on dataWriter error

[github] [BEAM-8651] [BEAM-8874] Change pickle_lock to be a reentrant lock, and

[lostluck] [GoSDK] Don't panic if debug symbols are striped

[lcwik] [BEAM-8523] Regenerate Go protos with respect to changes in #9959


------------------------------------------
[...truncated 1.32 MB...]
19/12/04 01:40:08 INFO sdk_worker_main.start: Status HTTP server running at localhost:35573
19/12/04 01:40:08 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 01:40:08 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 01:40:08 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575423603.25_e9c31a43-4921-421e-b4c0-3413b1f8578c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 01:40:08 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575423603.25', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:47731', 'job_port': u'0'}
19/12/04 01:40:08 INFO statecache.__init__: Creating state cache with size 0
19/12/04 01:40:08 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39001.
19/12/04 01:40:08 INFO sdk_worker.__init__: Control channel established.
19/12/04 01:40:08 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 01:40:08 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/04 01:40:08 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38877.
19/12/04 01:40:08 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 01:40:08 INFO data_plane.create_data_channel: Creating client data channel for localhost:40891
19/12/04 01:40:08 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 01:40:08 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 01:40:08 INFO sdk_worker.run: No more requests from control plane
19/12/04 01:40:08 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 01:40:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 01:40:08 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 01:40:08 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 01:40:08 INFO sdk_worker.run: Done consuming work.
19/12/04 01:40:08 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 01:40:08 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 01:40:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 01:40:08 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 01:40:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 01:40:09 INFO sdk_worker_main.main: Logging handler created.
19/12/04 01:40:09 INFO sdk_worker_main.start: Status HTTP server running at localhost:46009
19/12/04 01:40:09 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 01:40:09 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 01:40:09 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575423603.25_e9c31a43-4921-421e-b4c0-3413b1f8578c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 01:40:09 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575423603.25', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:47731', 'job_port': u'0'}
19/12/04 01:40:09 INFO statecache.__init__: Creating state cache with size 0
19/12/04 01:40:09 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33879.
19/12/04 01:40:09 INFO sdk_worker.__init__: Control channel established.
19/12/04 01:40:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/04 01:40:09 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 01:40:09 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34221.
19/12/04 01:40:09 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 01:40:09 INFO data_plane.create_data_channel: Creating client data channel for localhost:35113
19/12/04 01:40:09 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 01:40:09 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 01:40:09 INFO sdk_worker.run: No more requests from control plane
19/12/04 01:40:09 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 01:40:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 01:40:09 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 01:40:09 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 01:40:09 INFO sdk_worker.run: Done consuming work.
19/12/04 01:40:09 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 01:40:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 01:40:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 01:40:09 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 01:40:10 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 01:40:10 INFO sdk_worker_main.main: Logging handler created.
19/12/04 01:40:10 INFO sdk_worker_main.start: Status HTTP server running at localhost:32899
19/12/04 01:40:10 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 01:40:10 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 01:40:10 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575423603.25_e9c31a43-4921-421e-b4c0-3413b1f8578c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 01:40:10 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575423603.25', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:47731', 'job_port': u'0'}
19/12/04 01:40:10 INFO statecache.__init__: Creating state cache with size 0
19/12/04 01:40:10 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39949.
19/12/04 01:40:10 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/04 01:40:10 INFO sdk_worker.__init__: Control channel established.
19/12/04 01:40:10 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 01:40:10 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40095.
19/12/04 01:40:10 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 01:40:10 INFO data_plane.create_data_channel: Creating client data channel for localhost:35075
19/12/04 01:40:10 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 01:40:10 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 01:40:10 INFO sdk_worker.run: No more requests from control plane
19/12/04 01:40:10 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 01:40:10 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 01:40:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 01:40:10 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 01:40:10 INFO sdk_worker.run: Done consuming work.
19/12/04 01:40:10 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 01:40:10 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 01:40:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 01:40:10 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 01:40:11 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 01:40:11 INFO sdk_worker_main.main: Logging handler created.
19/12/04 01:40:11 INFO sdk_worker_main.start: Status HTTP server running at localhost:35985
19/12/04 01:40:11 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 01:40:11 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 01:40:11 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575423603.25_e9c31a43-4921-421e-b4c0-3413b1f8578c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 01:40:11 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575423603.25', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:47731', 'job_port': u'0'}
19/12/04 01:40:11 INFO statecache.__init__: Creating state cache with size 0
19/12/04 01:40:11 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39537.
19/12/04 01:40:11 INFO sdk_worker.__init__: Control channel established.
19/12/04 01:40:11 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/04 01:40:11 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 01:40:11 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35983.
19/12/04 01:40:11 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 01:40:11 INFO data_plane.create_data_channel: Creating client data channel for localhost:45017
19/12/04 01:40:11 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 01:40:11 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 01:40:11 INFO sdk_worker.run: No more requests from control plane
19/12/04 01:40:11 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 01:40:11 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 01:40:11 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 01:40:11 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 01:40:11 INFO sdk_worker.run: Done consuming work.
19/12/04 01:40:11 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 01:40:11 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 01:40:11 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 01:40:11 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575423603.25_e9c31a43-4921-421e-b4c0-3413b1f8578c finished.
19/12/04 01:40:11 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/04 01:40:11 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_c80dd2ea-4386-48b6-8a25-4f701311bb19","basePath":"/tmp/sparktestm1A3Zb"}: {}
java.io.FileNotFoundException: /tmp/sparktestm1A3Zb/job_c80dd2ea-4386-48b6-8a25-4f701311bb19/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140031056205568)>
# Thread: <Thread(Thread-120, started daemon 140031047812864)>
# Thread: <_MainThread(MainThread, started 140031835944704)>
==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140030949447424)>
# Thread: <Thread(Thread-126, started daemon 140030957840128)>
# Thread: <Thread(Thread-120, started daemon 140031047812864)>
# Thread: <_MainThread(MainThread, started 140031835944704)>
# Thread: <Thread(wait_until_finish_read, started daemon 140031056205568)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575423581.94_481a58c0-479e-49d5-ad31-857ba4bf39f8 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

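The test_sdf_with_watermark_tracking failure is different in kind from the timeouts: when a splittable DoFn checkpoints, the SDK harness returns a "residual" describing the unprocessed remainder of the element, and the runner must have registered a handler that re-schedules that deferred work. The Spark portable runner has no such handler at this point, hence the UnsupportedOperationException. A toy sketch of the contract, in Python for illustration (not Beam's actual Java API):

    class ActiveBundle(object):
        """Toy model of a bundle in flight on an SDK harness."""

        def __init__(self, checkpoint_handler=None):
            self._checkpoint_handler = checkpoint_handler

        def on_checkpoint(self, residuals):
            if self._checkpoint_handler is None:
                # Mirrors the UnsupportedOperationException in the log above.
                raise NotImplementedError(
                    'The ActiveBundle does not have a registered bundle '
                    'checkpoint handler.')
            for residual in residuals:
                # A real runner would feed each residual back into its
                # scheduler so the deferred work eventually runs.
                self._checkpoint_handler(residual)

    # A runner that supports SDF checkpointing registers a handler up front:
    pending = []
    bundle = ActiveBundle(checkpoint_handler=pending.append)
    bundle.on_checkpoint(['<remainder of element range>'])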
----------------------------------------------------------------------
Ran 38 tests in 358.403s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 39s
60 actionable tasks: 48 executed, 12 from cache

Publishing build scan...
https://scans.gradle.com/s/usp7qwxak4whw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1682

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1682/display/redirect?page=changes>

Changes:

[rohde.samuel] change definition of has_unbounded_sources in PIN to a pre-determined

[rohde.samuel] typo

[rohde.samuel] lint

[rohde.samuel] remove BigQueryReader from list

[rohde.samuel] lint

[rohde.samuel] remove external

[rohde.samuel] remove external

[github] Merge pull request #10248: [BEAM-7274] Add type conversions factory

[chamikara] Merge pull request #10262: [BEAM-8575] Revert validates runner test tag


------------------------------------------
[...truncated 1.31 MB...]
19/12/03 21:51:57 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1575409916.95_17d55ea6-f15e-4fb4-921d-c3ebbaeb60c6 on Spark master local
19/12/03 21:51:57 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
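The "not consistent with equals" warning above is worth unpacking: the Spark translation groups records by the encoded bytes of their key and window, which is only equivalent to grouping by the values themselves if equal values always encode to equal bytes. A simplified illustration of the property being relied on (Beam's real LengthPrefixCoder uses a varint length prefix, not the fixed-width one below):

    import struct

    def encode_length_prefixed(payload):
        # 4-byte big-endian length, then the raw bytes: a simplified
        # stand-in for LengthPrefixCoder(ByteArrayCoder).
        return struct.pack('>I', len(payload)) + payload

    # Equals-consistent encoding: equal values produce equal bytes and
    # vice versa, so grouping by encoded key bytes is safe.
    assert encode_length_prefixed(b'key') == encode_length_prefixed(b'key')
    assert encode_length_prefixed(b'a') != encode_length_prefixed(b'b')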
19/12/03 21:51:57 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575409916.95_17d55ea6-f15e-4fb4-921d-c3ebbaeb60c6: Pipeline translated successfully. Computing outputs
19/12/03 21:51:58 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 21:51:58 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 21:51:58 INFO sdk_worker_main.main: Logging handler created.
19/12/03 21:51:58 INFO sdk_worker_main.start: Status HTTP server running at localhost:42641
19/12/03 21:51:58 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 21:51:58 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 21:51:58 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575409916.95_17d55ea6-f15e-4fb4-921d-c3ebbaeb60c6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 21:51:58 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575409916.95', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53383', 'job_port': u'0'}
19/12/03 21:51:58 INFO statecache.__init__: Creating state cache with size 0
19/12/03 21:51:58 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36467.
19/12/03 21:51:58 INFO sdk_worker.__init__: Control channel established.
19/12/03 21:51:58 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/03 21:51:58 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 21:51:58 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41631.
19/12/03 21:51:58 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 21:51:58 INFO data_plane.create_data_channel: Creating client data channel for localhost:43143
19/12/03 21:51:58 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 21:51:58 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 21:51:58 INFO sdk_worker.run: No more requests from control plane
19/12/03 21:51:58 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 21:51:58 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 21:51:58 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 21:51:58 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 21:51:58 INFO sdk_worker.run: Done consuming work.
19/12/03 21:51:58 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 21:51:58 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 21:51:58 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 21:51:58 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 21:51:59 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 21:51:59 INFO sdk_worker_main.main: Logging handler created.
19/12/03 21:51:59 INFO sdk_worker_main.start: Status HTTP server running at localhost:45779
19/12/03 21:51:59 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 21:51:59 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 21:51:59 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575409916.95_17d55ea6-f15e-4fb4-921d-c3ebbaeb60c6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 21:51:59 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575409916.95', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53383', 'job_port': u'0'}
19/12/03 21:51:59 INFO statecache.__init__: Creating state cache with size 0
19/12/03 21:51:59 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34965.
19/12/03 21:51:59 INFO sdk_worker.__init__: Control channel established.
19/12/03 21:51:59 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/03 21:51:59 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 21:51:59 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38899.
19/12/03 21:51:59 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 21:51:59 INFO data_plane.create_data_channel: Creating client data channel for localhost:46697
19/12/03 21:51:59 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 21:51:59 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 21:51:59 INFO sdk_worker.run: No more requests from control plane
19/12/03 21:51:59 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 21:51:59 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 21:51:59 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 21:51:59 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 21:51:59 INFO sdk_worker.run: Done consuming work.
19/12/03 21:51:59 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 21:51:59 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 21:51:59 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 21:51:59 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 21:52:00 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 21:52:00 INFO sdk_worker_main.main: Logging handler created.
19/12/03 21:52:00 INFO sdk_worker_main.start: Status HTTP server running at localhost:44921
19/12/03 21:52:00 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 21:52:00 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 21:52:00 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575409916.95_17d55ea6-f15e-4fb4-921d-c3ebbaeb60c6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 21:52:00 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575409916.95', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53383', 'job_port': u'0'}
19/12/03 21:52:00 INFO statecache.__init__: Creating state cache with size 0
19/12/03 21:52:00 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43533.
19/12/03 21:52:00 INFO sdk_worker.__init__: Control channel established.
19/12/03 21:52:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/03 21:52:00 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 21:52:00 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46109.
19/12/03 21:52:00 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 21:52:00 INFO data_plane.create_data_channel: Creating client data channel for localhost:34391
19/12/03 21:52:00 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 21:52:00 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 21:52:00 INFO sdk_worker.run: No more requests from control plane
19/12/03 21:52:00 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 21:52:00 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 21:52:00 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 21:52:00 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 21:52:00 INFO sdk_worker.run: Done consuming work.
19/12/03 21:52:00 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 21:52:00 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 21:52:00 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 21:52:00 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 21:52:01 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 21:52:01 INFO sdk_worker_main.main: Logging handler created.
19/12/03 21:52:01 INFO sdk_worker_main.start: Status HTTP server running at localhost:34535
19/12/03 21:52:01 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 21:52:01 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 21:52:01 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575409916.95_17d55ea6-f15e-4fb4-921d-c3ebbaeb60c6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 21:52:01 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575409916.95', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53383', 'job_port': u'0'}
19/12/03 21:52:01 INFO statecache.__init__: Creating state cache with size 0
19/12/03 21:52:01 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38479.
19/12/03 21:52:01 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/03 21:52:01 INFO sdk_worker.__init__: Control channel established.
19/12/03 21:52:01 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 21:52:01 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44381.
19/12/03 21:52:01 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 21:52:01 INFO data_plane.create_data_channel: Creating client data channel for localhost:35977
19/12/03 21:52:01 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 21:52:01 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 21:52:01 INFO sdk_worker.run: No more requests from control plane
19/12/03 21:52:01 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 21:52:01 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 21:52:01 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 21:52:01 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 21:52:01 INFO sdk_worker.run: Done consuming work.
19/12/03 21:52:01 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 21:52:01 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 21:52:01 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 21:52:01 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 21:52:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 21:52:02 INFO sdk_worker_main.main: Logging handler created.
19/12/03 21:52:02 INFO sdk_worker_main.start: Status HTTP server running at localhost:37107
19/12/03 21:52:02 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 21:52:02 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 21:52:02 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575409916.95_17d55ea6-f15e-4fb4-921d-c3ebbaeb60c6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 21:52:02 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575409916.95', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53383', 'job_port': u'0'}
19/12/03 21:52:02 INFO statecache.__init__: Creating state cache with size 0
19/12/03 21:52:02 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41631.
19/12/03 21:52:02 INFO sdk_worker.__init__: Control channel established.
19/12/03 21:52:02 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 21:52:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/03 21:52:02 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45533.
19/12/03 21:52:02 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 21:52:02 INFO data_plane.create_data_channel: Creating client data channel for localhost:36265
19/12/03 21:52:02 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 21:52:02 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 21:52:02 INFO sdk_worker.run: No more requests from control plane
19/12/03 21:52:02 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 21:52:02 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 21:52:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 21:52:02 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 21:52:02 INFO sdk_worker.run: Done consuming work.
19/12/03 21:52:02 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 21:52:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 21:52:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 21:52:02 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575409916.95_17d55ea6-f15e-4fb4-921d-c3ebbaeb60c6 finished.
19/12/03 21:52:02 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/03 21:52:02 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_5b6fcb0f-ac5f-4898-b636-f5b177eeabc5","basePath":"/tmp/sparktest7B3eMY"}: {}
java.io.FileNotFoundException: /tmp/sparktest7B3eMY/job_5b6fcb0f-ac5f-4898-b636-f5b177eeabc5/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140239455958784)>

# Thread: <Thread(Thread-120, started daemon 140239447566080)>

# Thread: <_MainThread(MainThread, started 140240235407104)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575409907.9_370c1377-80e9-4954-b0b6-c5e573567446 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 293.134s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 38s
60 actionable tasks: 56 executed, 4 from cache

Publishing build scan...
https://scans.gradle.com/s/5mivsj5vsyqac

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1681

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1681/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-8251] plumb worker_(region|zone) to Environment proto

[kcweaver] Add null checks for worker region/zone options


------------------------------------------
[...truncated 1.32 MB...]
19/12/03 19:01:18 INFO sdk_worker_main.start: Status HTTP server running at localhost:41195
19/12/03 19:01:18 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 19:01:18 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 19:01:18 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575399675.95_649faa8c-3a04-4926-a03b-27dd2c3e5f7c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 19:01:18 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575399675.95', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50261', 'job_port': u'0'}
19/12/03 19:01:18 INFO statecache.__init__: Creating state cache with size 0
19/12/03 19:01:18 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40043.
19/12/03 19:01:18 INFO sdk_worker.__init__: Control channel established.
19/12/03 19:01:18 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/03 19:01:18 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 19:01:18 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36633.
19/12/03 19:01:18 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 19:01:18 INFO data_plane.create_data_channel: Creating client data channel for localhost:43169
19/12/03 19:01:18 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 19:01:18 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 19:01:18 INFO sdk_worker.run: No more requests from control plane
19/12/03 19:01:18 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 19:01:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 19:01:18 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 19:01:18 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 19:01:18 INFO sdk_worker.run: Done consuming work.
19/12/03 19:01:18 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 19:01:18 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 19:01:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 19:01:18 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 19:01:19 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 19:01:19 INFO sdk_worker_main.main: Logging handler created.
19/12/03 19:01:19 INFO sdk_worker_main.start: Status HTTP server running at localhost:33569
19/12/03 19:01:19 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 19:01:19 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 19:01:19 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575399675.95_649faa8c-3a04-4926-a03b-27dd2c3e5f7c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 19:01:19 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575399675.95', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50261', 'job_port': u'0'}
19/12/03 19:01:19 INFO statecache.__init__: Creating state cache with size 0
19/12/03 19:01:19 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36449.
19/12/03 19:01:19 INFO sdk_worker.__init__: Control channel established.
19/12/03 19:01:19 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/03 19:01:19 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 19:01:19 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45139.
19/12/03 19:01:19 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 19:01:19 INFO data_plane.create_data_channel: Creating client data channel for localhost:35613
19/12/03 19:01:19 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 19:01:19 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 19:01:19 INFO sdk_worker.run: No more requests from control plane
19/12/03 19:01:19 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 19:01:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 19:01:19 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 19:01:19 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 19:01:19 INFO sdk_worker.run: Done consuming work.
19/12/03 19:01:19 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 19:01:19 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 19:01:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 19:01:19 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 19:01:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 19:01:20 INFO sdk_worker_main.main: Logging handler created.
19/12/03 19:01:20 INFO sdk_worker_main.start: Status HTTP server running at localhost:40891
19/12/03 19:01:20 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 19:01:20 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 19:01:20 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575399675.95_649faa8c-3a04-4926-a03b-27dd2c3e5f7c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 19:01:20 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575399675.95', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50261', 'job_port': u'0'}
19/12/03 19:01:20 INFO statecache.__init__: Creating state cache with size 0
19/12/03 19:01:20 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43941.
19/12/03 19:01:20 INFO sdk_worker.__init__: Control channel established.
19/12/03 19:01:20 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/03 19:01:20 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 19:01:20 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44587.
19/12/03 19:01:20 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 19:01:20 INFO data_plane.create_data_channel: Creating client data channel for localhost:35165
19/12/03 19:01:20 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 19:01:20 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 19:01:20 INFO sdk_worker.run: No more requests from control plane
19/12/03 19:01:20 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 19:01:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 19:01:20 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 19:01:20 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 19:01:20 INFO sdk_worker.run: Done consuming work.
19/12/03 19:01:20 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 19:01:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 19:01:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 19:01:20 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 19:01:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 19:01:20 INFO sdk_worker_main.main: Logging handler created.
19/12/03 19:01:20 INFO sdk_worker_main.start: Status HTTP server running at localhost:33915
19/12/03 19:01:20 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 19:01:20 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 19:01:20 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575399675.95_649faa8c-3a04-4926-a03b-27dd2c3e5f7c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 19:01:20 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575399675.95', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50261', 'job_port': u'0'}
19/12/03 19:01:20 INFO statecache.__init__: Creating state cache with size 0
19/12/03 19:01:20 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36813.
19/12/03 19:01:20 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/03 19:01:20 INFO sdk_worker.__init__: Control channel established.
19/12/03 19:01:20 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 19:01:21 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44343.
19/12/03 19:01:21 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 19:01:21 INFO data_plane.create_data_channel: Creating client data channel for localhost:43425
19/12/03 19:01:21 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 19:01:21 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 19:01:21 INFO sdk_worker.run: No more requests from control plane
19/12/03 19:01:21 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 19:01:21 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 19:01:21 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 19:01:21 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 19:01:21 INFO sdk_worker.run: Done consuming work.
19/12/03 19:01:21 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 19:01:21 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 19:01:21 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 19:01:21 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575399675.95_649faa8c-3a04-4926-a03b-27dd2c3e5f7c finished.
19/12/03 19:01:21 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/03 19:01:21 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_63263033-cbbc-4326-9f78-f6fd0030a82c","basePath":"/tmp/sparktestcPqWqB"}: {}
java.io.FileNotFoundException: /tmp/sparktestcPqWqB/job_63263033-cbbc-4326-9f78-f6fd0030a82c/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 358, in wait
    delay = min(delay * 2, remaining, .05)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140172163282688)>

# Thread: <Thread(Thread-119, started daemon 140172654675712)>

# Thread: <_MainThread(MainThread, started 140173434124032)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140172146497280)>

# Thread: <Thread(Thread-125, started daemon 140172154889984)>

# Thread: <_MainThread(MainThread, started 140173434124032)>

# Thread: <Thread(Thread-119, started daemon 140172654675712)>

# Thread: <Thread(wait_until_finish_read, started daemon 140172163282688)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575399667.25_2f1240de-f72f-4d6b-84c4-b217ac653cea failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 293.126s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 15s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/hostbjuotr4ii

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1680

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1680/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/12/03 18:22:56 INFO sdk_worker_main.start: Status HTTP server running at localhost:41071
19/12/03 18:22:56 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 18:22:56 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 18:22:56 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575397373.58_ad87b30f-be4f-4393-94ac-6f006cce3e50', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 18:22:56 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575397373.58', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35285', 'job_port': u'0'}
19/12/03 18:22:56 INFO statecache.__init__: Creating state cache with size 0
19/12/03 18:22:56 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46753.
19/12/03 18:22:56 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/03 18:22:56 INFO sdk_worker.__init__: Control channel established.
19/12/03 18:22:56 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 18:22:56 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34779.
19/12/03 18:22:56 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 18:22:56 INFO data_plane.create_data_channel: Creating client data channel for localhost:46315
19/12/03 18:22:56 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 18:22:56 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 18:22:56 INFO sdk_worker.run: No more requests from control plane
19/12/03 18:22:56 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 18:22:56 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 18:22:56 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 18:22:56 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 18:22:56 INFO sdk_worker.run: Done consuming work.
19/12/03 18:22:56 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 18:22:56 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 18:22:57 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 18:22:57 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 18:22:57 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 18:22:57 INFO sdk_worker_main.main: Logging handler created.
19/12/03 18:22:57 INFO sdk_worker_main.start: Status HTTP server running at localhost:44997
19/12/03 18:22:57 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 18:22:57 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 18:22:57 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575397373.58_ad87b30f-be4f-4393-94ac-6f006cce3e50', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 18:22:57 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575397373.58', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35285', 'job_port': u'0'}
19/12/03 18:22:57 INFO statecache.__init__: Creating state cache with size 0
19/12/03 18:22:57 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41191.
19/12/03 18:22:57 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/03 18:22:57 INFO sdk_worker.__init__: Control channel established.
19/12/03 18:22:57 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 18:22:57 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43971.
19/12/03 18:22:57 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 18:22:57 INFO data_plane.create_data_channel: Creating client data channel for localhost:46091
19/12/03 18:22:57 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 18:22:58 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 18:22:58 INFO sdk_worker.run: No more requests from control plane
19/12/03 18:22:58 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 18:22:58 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 18:22:58 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 18:22:58 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 18:22:58 INFO sdk_worker.run: Done consuming work.
19/12/03 18:22:58 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 18:22:58 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 18:22:58 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 18:22:58 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 18:22:58 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 18:22:58 INFO sdk_worker_main.main: Logging handler created.
19/12/03 18:22:58 INFO sdk_worker_main.start: Status HTTP server running at localhost:37463
19/12/03 18:22:58 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 18:22:58 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 18:22:58 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575397373.58_ad87b30f-be4f-4393-94ac-6f006cce3e50', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 18:22:58 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575397373.58', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35285', 'job_port': u'0'}
19/12/03 18:22:58 INFO statecache.__init__: Creating state cache with size 0
19/12/03 18:22:58 INFO sdk_worker.__init__: Creating insecure control channel for localhost:32847.
19/12/03 18:22:58 INFO sdk_worker.__init__: Control channel established.
19/12/03 18:22:58 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/03 18:22:58 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 18:22:58 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44681.
19/12/03 18:22:58 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 18:22:58 INFO data_plane.create_data_channel: Creating client data channel for localhost:41993
19/12/03 18:22:58 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 18:22:58 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 18:22:58 INFO sdk_worker.run: No more requests from control plane
19/12/03 18:22:58 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 18:22:58 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 18:22:58 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 18:22:58 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 18:22:58 INFO sdk_worker.run: Done consuming work.
19/12/03 18:22:58 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 18:22:58 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 18:22:59 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 18:22:59 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 18:22:59 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 18:22:59 INFO sdk_worker_main.main: Logging handler created.
19/12/03 18:22:59 INFO sdk_worker_main.start: Status HTTP server running at localhost:42453
19/12/03 18:22:59 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 18:22:59 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 18:22:59 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575397373.58_ad87b30f-be4f-4393-94ac-6f006cce3e50', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 18:22:59 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575397373.58', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35285', 'job_port': u'0'}
19/12/03 18:22:59 INFO statecache.__init__: Creating state cache with size 0
19/12/03 18:22:59 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36699.
19/12/03 18:22:59 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/03 18:22:59 INFO sdk_worker.__init__: Control channel established.
19/12/03 18:22:59 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 18:22:59 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37289.
19/12/03 18:22:59 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 18:22:59 INFO data_plane.create_data_channel: Creating client data channel for localhost:37039
19/12/03 18:22:59 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 18:22:59 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 18:22:59 INFO sdk_worker.run: No more requests from control plane
19/12/03 18:22:59 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 18:22:59 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 18:22:59 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 18:22:59 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 18:22:59 INFO sdk_worker.run: Done consuming work.
19/12/03 18:22:59 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 18:22:59 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 18:23:00 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 18:23:00 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575397373.58_ad87b30f-be4f-4393-94ac-6f006cce3e50 finished.
19/12/03 18:23:00 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/03 18:23:00 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_6eee6b44-5e40-4cd6-baf0-e6a8e9ddc131","basePath":"/tmp/sparktestBjUqOr"}: {}
java.io.FileNotFoundException: /tmp/sparktestBjUqOr/job_6eee6b44-5e40-4cd6-baf0-e6a8e9ddc131/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
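This ERROR is cleanup noise rather than a test failure: the job service removes a job's staging directory by first reading its MANIFEST, and since nothing was staged for this job (note the "GetManifest for __no_artifacts_staged__" lines throughout), the file never existed. A sketch of that cleanup order in hypothetical Python (Beam's actual implementation is the Java BeamFileSystemArtifactStagingService.removeArtifacts in the trace above; the manifest format below is assumed):

    import json
    import os

    def remove_artifacts(base_path, session_id):
        # Reading the manifest first is the step that raises when nothing
        # was ever staged under this staging token.
        manifest = os.path.join(base_path, session_id, 'MANIFEST')
        try:
            with open(manifest) as f:
                names = json.load(f)  # assumed: a JSON list of staged files
        except (IOError, OSError):
            return  # nothing staged, so nothing to clean up
        for name in names:
            os.remove(os.path.join(base_path, session_id, name))
        os.remove(manifest)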
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140419824506624)>

# Thread: <Thread(Thread-120, started daemon 140419832899328)>

# Thread: <_MainThread(MainThread, started 140420621031168)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140419807196928)>

# Thread: <Thread(Thread-126, started daemon 140419815851776)>

# Thread: <_MainThread(MainThread, started 140420621031168)>

# Thread: <Thread(Thread-120, started daemon 140419832899328)>

# Thread: <Thread(wait_until_finish_read, started daemon 140419824506624)>
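The "Timed out" banners and "# Thread:" listings above come from the suite's timeout watchdog, visible in the traceback as the handler frame at portable_runner_test.py line 75: when a test exceeds its deadline it dumps every live thread, then raises BaseException so that broad "except Exception" clauses in the code under test cannot swallow the timeout. A sketch of that pattern under the assumption of a signal.alarm-based watchdog (names other than handler are hypothetical):

    import signal
    import threading

    TIMEOUT_SECS = 60

    def handler(signum, frame):
        msg = 'Timed out after %d seconds.' % TIMEOUT_SECS
        print('==================== %s ====================' % msg)
        # repr() of a Thread yields the "<Thread(..., started daemon ...)>"
        # form seen in the dumps above.
        for t in threading.enumerate():
            print('# Thread: %r' % t)
        raise BaseException(msg)  # BaseException, so it is never swallowed

    signal.signal(signal.SIGALRM, handler)
    signal.alarm(TIMEOUT_SECS)  # the real suite re-arms this per test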

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575397362.88_94fd96ea-8d14-48fa-a799-469c421cbc90 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
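Unlike the two timeouts above, this failure is immediate: the splittable DoFn in this test tracks a watermark and hands unfinished work (a checkpoint residual) back to the runner, and the Spark portable runner registers no handler to accept it. A conceptual sketch of the contract in hypothetical Python (paraphrasing the Java ActiveBundle named in the error, not Beam's actual code):

    class ActiveBundle(object):
        """Hypothetical sketch of the contract behind the error above."""

        def __init__(self, checkpoint_handler=None):
            # The Spark portable runner leaves this unset.
            self._checkpoint_handler = checkpoint_handler

        def handle_result(self, result):
            residuals = result.get('residual_roots', [])
            if residuals and self._checkpoint_handler is None:
                raise RuntimeError(
                    'The ActiveBundle does not have a registered bundle '
                    'checkpoint handler.')
            for residual in residuals:
                self._checkpoint_handler(residual)  # reschedule deferred work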

----------------------------------------------------------------------
Ran 38 tests in 341.460s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 33s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/my7zbyazlh7ms

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1679

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1679/display/redirect?page=changes>

Changes:

[kamil.wasilewski] Fixed a bug where the output PCollection was assigned to self.result


------------------------------------------
[...truncated 1.32 MB...]
19/12/03 15:50:45 INFO sdk_worker_main.start: Status HTTP server running at localhost:43901
19/12/03 15:50:45 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 15:50:45 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 15:50:45 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575388242.88_c278ce29-8768-480a-a1af-1a7aa80c9420', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 15:50:45 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575388242.88', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:39685', 'job_port': u'0'}
19/12/03 15:50:45 INFO statecache.__init__: Creating state cache with size 0
19/12/03 15:50:45 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38113.
19/12/03 15:50:45 INFO sdk_worker.__init__: Control channel established.
19/12/03 15:50:45 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 15:50:45 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/03 15:50:45 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34311.
19/12/03 15:50:45 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 15:50:45 INFO data_plane.create_data_channel: Creating client data channel for localhost:42601
19/12/03 15:50:45 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 15:50:45 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 15:50:45 INFO sdk_worker.run: No more requests from control plane
19/12/03 15:50:45 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 15:50:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 15:50:45 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 15:50:45 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 15:50:45 INFO sdk_worker.run: Done consuming work.
19/12/03 15:50:45 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 15:50:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 15:50:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 15:50:45 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 15:50:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 15:50:46 INFO sdk_worker_main.main: Logging handler created.
19/12/03 15:50:46 INFO sdk_worker_main.start: Status HTTP server running at localhost:34583
19/12/03 15:50:46 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 15:50:46 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 15:50:46 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575388242.88_c278ce29-8768-480a-a1af-1a7aa80c9420', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 15:50:46 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575388242.88', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:39685', 'job_port': u'0'}
19/12/03 15:50:46 INFO statecache.__init__: Creating state cache with size 0
19/12/03 15:50:46 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44497.
19/12/03 15:50:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/03 15:50:46 INFO sdk_worker.__init__: Control channel established.
19/12/03 15:50:46 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 15:50:46 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36797.
19/12/03 15:50:46 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 15:50:46 INFO data_plane.create_data_channel: Creating client data channel for localhost:45661
19/12/03 15:50:46 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 15:50:46 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 15:50:46 INFO sdk_worker.run: No more requests from control plane
19/12/03 15:50:46 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 15:50:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 15:50:46 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 15:50:46 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 15:50:46 INFO sdk_worker.run: Done consuming work.
19/12/03 15:50:46 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 15:50:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 15:50:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 15:50:46 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 15:50:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 15:50:47 INFO sdk_worker_main.main: Logging handler created.
19/12/03 15:50:47 INFO sdk_worker_main.start: Status HTTP server running at localhost:39605
19/12/03 15:50:47 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 15:50:47 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 15:50:47 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575388242.88_c278ce29-8768-480a-a1af-1a7aa80c9420', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 15:50:47 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575388242.88', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:39685', 'job_port': u'0'}
19/12/03 15:50:47 INFO statecache.__init__: Creating state cache with size 0
19/12/03 15:50:47 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45307.
19/12/03 15:50:47 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/03 15:50:47 INFO sdk_worker.__init__: Control channel established.
19/12/03 15:50:47 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 15:50:47 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40885.
19/12/03 15:50:47 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 15:50:47 INFO data_plane.create_data_channel: Creating client data channel for localhost:37659
19/12/03 15:50:47 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 15:50:47 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 15:50:47 INFO sdk_worker.run: No more requests from control plane
19/12/03 15:50:47 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 15:50:47 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 15:50:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 15:50:47 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 15:50:47 INFO sdk_worker.run: Done consuming work.
19/12/03 15:50:47 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 15:50:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 15:50:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 15:50:47 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 15:50:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 15:50:47 INFO sdk_worker_main.main: Logging handler created.
19/12/03 15:50:47 INFO sdk_worker_main.start: Status HTTP server running at localhost:32891
19/12/03 15:50:47 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 15:50:47 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 15:50:47 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575388242.88_c278ce29-8768-480a-a1af-1a7aa80c9420', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 15:50:47 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575388242.88', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:39685', 'job_port': u'0'}
19/12/03 15:50:47 INFO statecache.__init__: Creating state cache with size 0
19/12/03 15:50:47 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34941.
19/12/03 15:50:47 INFO sdk_worker.__init__: Control channel established.
19/12/03 15:50:47 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/03 15:50:47 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 15:50:47 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42221.
19/12/03 15:50:47 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 15:50:47 INFO data_plane.create_data_channel: Creating client data channel for localhost:41727
19/12/03 15:50:47 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 15:50:47 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 15:50:47 INFO sdk_worker.run: No more requests from control plane
19/12/03 15:50:47 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 15:50:47 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 15:50:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 15:50:47 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 15:50:47 INFO sdk_worker.run: Done consuming work.
19/12/03 15:50:47 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 15:50:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 15:50:48 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 15:50:48 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575388242.88_c278ce29-8768-480a-a1af-1a7aa80c9420 finished.
19/12/03 15:50:48 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/03 15:50:48 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_d33d6692-0e6d-47b2-9393-7958c1829329","basePath":"/tmp/sparktestZzRAEU"}: {}
java.io.FileNotFoundException: /tmp/sparktestZzRAEU/job_d33d6692-0e6d-47b2-9393-7958c1829329/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140533544515328)>

# Thread: <Thread(Thread-119, started daemon 140533894498048)>

# Thread: <_MainThread(MainThread, started 140534673737472)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140533519337216)>

# Thread: <Thread(Thread-123, started daemon 140533527729920)>

# Thread: <_MainThread(MainThread, started 140534673737472)>

# Thread: <Thread(Thread-119, started daemon 140533894498048)>

# Thread: <Thread(wait_until_finish_read, started daemon 140533544515328)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575388234.1_95b07355-6f9d-40fc-a6a2-eeb51d7a9e26 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 296.757s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 29s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/zab3womupjjgc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1678

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1678/display/redirect>

Changes:


------------------------------------------
[...truncated 1.31 MB...]
19/12/03 12:09:39 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1575374977.56_6084a658-2be8-4d03-8ebc-83558061f5a5 on Spark master local
19/12/03 12:09:39 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/12/03 12:09:39 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575374977.56_6084a658-2be8-4d03-8ebc-83558061f5a5: Pipeline translated successfully. Computing outputs
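The coder warning above matters because this translation path groups elements by the encoded bytes of their keys: GroupNonMergingWindowsFunctions compares serialized key/window pairs directly, which is only safe when equal values always encode to equal bytes ("consistent with equals"). A small illustration of the property, using pickle as a stand-in encoder (illustrative only, not a Beam coder):

    import pickle

    a = {'x': 1, 'y': 2}
    b = {'y': 2, 'x': 1}

    assert a == b  # equal values...
    # ...but an encoding sensitive to insertion order can produce different
    # bytes, so a byte-level groupByKey may split one logical key.
    print(pickle.dumps(a) == pickle.dumps(b))  # typically False on Python 3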
19/12/03 12:09:39 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 12:09:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 12:09:40 INFO sdk_worker_main.main: Logging handler created.
19/12/03 12:09:40 INFO sdk_worker_main.start: Status HTTP server running at localhost:36999
19/12/03 12:09:40 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 12:09:40 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 12:09:40 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575374977.56_6084a658-2be8-4d03-8ebc-83558061f5a5', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 12:09:40 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575374977.56', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36381', 'job_port': u'0'}
19/12/03 12:09:40 INFO statecache.__init__: Creating state cache with size 0
19/12/03 12:09:40 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41905.
19/12/03 12:09:40 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/03 12:09:40 INFO sdk_worker.__init__: Control channel established.
19/12/03 12:09:40 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 12:09:40 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41869.
19/12/03 12:09:40 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 12:09:40 INFO data_plane.create_data_channel: Creating client data channel for localhost:32883
19/12/03 12:09:40 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 12:09:40 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 12:09:40 INFO sdk_worker.run: No more requests from control plane
19/12/03 12:09:40 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 12:09:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 12:09:40 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 12:09:40 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 12:09:40 INFO sdk_worker.run: Done consuming work.
19/12/03 12:09:40 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 12:09:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 12:09:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 12:09:40 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 12:09:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 12:09:41 INFO sdk_worker_main.main: Logging handler created.
19/12/03 12:09:41 INFO sdk_worker_main.start: Status HTTP server running at localhost:41361
19/12/03 12:09:41 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 12:09:41 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 12:09:41 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575374977.56_6084a658-2be8-4d03-8ebc-83558061f5a5', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 12:09:41 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575374977.56', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36381', 'job_port': u'0'}
19/12/03 12:09:41 INFO statecache.__init__: Creating state cache with size 0
19/12/03 12:09:41 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42955.
19/12/03 12:09:41 INFO sdk_worker.__init__: Control channel established.
19/12/03 12:09:41 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/03 12:09:41 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 12:09:41 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42199.
19/12/03 12:09:41 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 12:09:41 INFO data_plane.create_data_channel: Creating client data channel for localhost:41633
19/12/03 12:09:41 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 12:09:41 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 12:09:41 INFO sdk_worker.run: No more requests from control plane
19/12/03 12:09:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 12:09:41 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 12:09:41 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 12:09:41 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 12:09:41 INFO sdk_worker.run: Done consuming work.
19/12/03 12:09:41 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 12:09:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 12:09:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 12:09:41 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 12:09:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 12:09:42 INFO sdk_worker_main.main: Logging handler created.
19/12/03 12:09:42 INFO sdk_worker_main.start: Status HTTP server running at localhost:34061
19/12/03 12:09:42 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 12:09:42 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 12:09:42 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575374977.56_6084a658-2be8-4d03-8ebc-83558061f5a5', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 12:09:42 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575374977.56', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36381', 'job_port': u'0'}
19/12/03 12:09:42 INFO statecache.__init__: Creating state cache with size 0
19/12/03 12:09:42 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44723.
19/12/03 12:09:42 INFO sdk_worker.__init__: Control channel established.
19/12/03 12:09:42 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 12:09:42 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/03 12:09:42 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33791.
19/12/03 12:09:42 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 12:09:42 INFO data_plane.create_data_channel: Creating client data channel for localhost:46877
19/12/03 12:09:42 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 12:09:42 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 12:09:42 INFO sdk_worker.run: No more requests from control plane
19/12/03 12:09:42 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 12:09:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 12:09:42 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 12:09:42 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 12:09:42 INFO sdk_worker.run: Done consuming work.
19/12/03 12:09:42 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 12:09:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 12:09:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 12:09:42 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 12:09:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 12:09:43 INFO sdk_worker_main.main: Logging handler created.
19/12/03 12:09:43 INFO sdk_worker_main.start: Status HTTP server running at localhost:40899
19/12/03 12:09:43 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 12:09:43 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 12:09:43 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575374977.56_6084a658-2be8-4d03-8ebc-83558061f5a5', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 12:09:43 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575374977.56', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36381', 'job_port': u'0'}
19/12/03 12:09:43 INFO statecache.__init__: Creating state cache with size 0
19/12/03 12:09:43 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40241.
19/12/03 12:09:43 INFO sdk_worker.__init__: Control channel established.
19/12/03 12:09:43 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 12:09:43 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/03 12:09:43 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46373.
19/12/03 12:09:43 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 12:09:43 INFO data_plane.create_data_channel: Creating client data channel for localhost:33663
19/12/03 12:09:43 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 12:09:43 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 12:09:43 INFO sdk_worker.run: No more requests from control plane
19/12/03 12:09:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 12:09:43 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 12:09:43 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 12:09:43 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 12:09:43 INFO sdk_worker.run: Done consuming work.
19/12/03 12:09:43 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 12:09:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 12:09:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 12:09:43 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 12:09:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 12:09:43 INFO sdk_worker_main.main: Logging handler created.
19/12/03 12:09:43 INFO sdk_worker_main.start: Status HTTP server running at localhost:34321
19/12/03 12:09:43 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 12:09:43 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 12:09:43 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575374977.56_6084a658-2be8-4d03-8ebc-83558061f5a5', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 12:09:43 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575374977.56', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36381', 'job_port': u'0'}
19/12/03 12:09:43 INFO statecache.__init__: Creating state cache with size 0
19/12/03 12:09:43 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42173.
19/12/03 12:09:43 INFO sdk_worker.__init__: Control channel established.
19/12/03 12:09:43 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/03 12:09:43 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 12:09:43 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46621.
19/12/03 12:09:43 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 12:09:43 INFO data_plane.create_data_channel: Creating client data channel for localhost:34433
19/12/03 12:09:43 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 12:09:43 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 12:09:43 INFO sdk_worker.run: No more requests from control plane
19/12/03 12:09:43 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 12:09:43 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 12:09:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 12:09:43 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 12:09:43 INFO sdk_worker.run: Done consuming work.
19/12/03 12:09:43 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 12:09:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 12:09:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 12:09:44 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575374977.56_6084a658-2be8-4d03-8ebc-83558061f5a5 finished.
19/12/03 12:09:44 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/03 12:09:44 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_40b98026-1784-4613-a2d3-b3ccbd2e8483","basePath":"/tmp/sparktestrOXqAf"}: {}
java.io.FileNotFoundException: /tmp/sparktestrOXqAf/job_40b98026-1784-4613-a2d3-b3ccbd2e8483/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
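
Note that this ERROR is cleanup noise rather than the test failure itself: no artifacts were staged (the log shows GetManifest for __no_artifacts_staged__), yet the shutdown path still tries to read a MANIFEST before deleting the staging directory. A hedged sketch of the tolerant-cleanup pattern that would avoid the spurious stack trace -- names are hypothetical, and this is Python rather than the Java code in the trace:

    import os
    import shutil

    def remove_staging_dir(base_path, session_id):
        staging_dir = os.path.join(base_path, session_id)
        manifest = os.path.join(staging_dir, 'MANIFEST')
        if not os.path.exists(manifest):
            # Nothing was staged for this job, so there is nothing to
            # clean up; returning quietly avoids the FileNotFoundException.
            return
        shutil.rmtree(staging_dir)
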
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139640728188672)>

# Thread: <Thread(Thread-119, started daemon 139640719795968)>

# Thread: <_MainThread(MainThread, started 139641864398592)>
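
The "# Thread:" lines woven through these tracebacks come from the suite's watchdog: when a test exceeds 60 seconds, a SIGALRM handler prints every live thread (to make the hang diagnosable) and then raises BaseException, and its output interleaves with whatever traceback is being printed at the same time. A simplified sketch of that mechanism, loosely modeled on the handler at portable_runner_test.py line 75 -- details here are illustrative:

    import signal
    import sys
    import threading
    import traceback

    TIMEOUT_SECS = 60

    def handler(signum, frame):
        msg = 'Timed out after %s seconds.' % TIMEOUT_SECS
        print('==================== %s ====================' % msg)
        # Dump every live thread's stack so the hang is diagnosable.
        for t in threading.enumerate():
            print('# Thread: %s' % t)
            stack = sys._current_frames().get(t.ident)
            if stack is not None:
                traceback.print_stack(stack)
        raise BaseException(msg)

    signal.signal(signal.SIGALRM, handler)
    signal.alarm(TIMEOUT_SECS)  # re-armed per test in the real suite
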
======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "ap# Thread: <_MainThread(MainThread, started 139641864398592)>
ache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575374968.28_eb6dfdbb-5ea0-497d-b043-186ee848ef7b failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 275.228s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
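
For what it's worth, the "non-zero exit value 1" is just the test script's exit status propagating: the Python test runner exits 1 on failure, the sh wrapper passes it through, and Gradle fails the task. The same contract in a small Python sketch (script name hypothetical):

    import subprocess
    import sys

    try:
        # check_call raises when the child exits non-zero, exactly the
        # contract Gradle applies to the 'sh' command above.
        subprocess.check_call(['sh', './run_validates_runner_tests.sh'])
    except subprocess.CalledProcessError as e:
        sys.exit(e.returncode)
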

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 58s
60 actionable tasks: 48 executed, 12 from cache

Publishing build scan...
https://scans.gradle.com/s/blpavca62heqy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1677

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1677/display/redirect?page=changes>

Changes:

[chadrik] [BEAM-8523] JobAPI: Give access to timestamped state change history

[chadrik] Rename GetJobStateResponse to JobStateEvent

[chadrik] Move state history utilities to AbstractBeamJob

[chadrik] Small bugfix to FlinkBeamJob job state mapping

[chadrik] Fix existing bugs in AbstractJobServiceServicer

[chadrik] Use timestamp.Timestamp instead of float


------------------------------------------
[...truncated 1.32 MB...]
19/12/03 11:35:02 INFO sdk_worker_main.start: Status HTTP server running at localhost:41335
19/12/03 11:35:02 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 11:35:02 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 11:35:02 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575372899.81_4d5c66d7-175f-482a-966c-31a058f1590f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 11:35:02 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575372899.81', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36757', 'job_port': u'0'}
19/12/03 11:35:02 INFO statecache.__init__: Creating state cache with size 0
19/12/03 11:35:02 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45955.
19/12/03 11:35:02 INFO sdk_worker.__init__: Control channel established.
19/12/03 11:35:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/03 11:35:02 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 11:35:02 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41491.
19/12/03 11:35:02 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 11:35:02 INFO data_plane.create_data_channel: Creating client data channel for localhost:42757
19/12/03 11:35:02 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 11:35:02 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 11:35:02 INFO sdk_worker.run: No more requests from control plane
19/12/03 11:35:02 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 11:35:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 11:35:02 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 11:35:02 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 11:35:02 INFO sdk_worker.run: Done consuming work.
19/12/03 11:35:02 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 11:35:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 11:35:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 11:35:03 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 11:35:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 11:35:03 INFO sdk_worker_main.main: Logging handler created.
19/12/03 11:35:03 INFO sdk_worker_main.start: Status HTTP server running at localhost:36193
19/12/03 11:35:03 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 11:35:03 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 11:35:03 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575372899.81_4d5c66d7-175f-482a-966c-31a058f1590f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 11:35:03 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575372899.81', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36757', 'job_port': u'0'}
19/12/03 11:35:03 INFO statecache.__init__: Creating state cache with size 0
19/12/03 11:35:03 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42431.
19/12/03 11:35:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/03 11:35:03 INFO sdk_worker.__init__: Control channel established.
19/12/03 11:35:03 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 11:35:03 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39075.
19/12/03 11:35:03 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 11:35:03 INFO data_plane.create_data_channel: Creating client data channel for localhost:36375
19/12/03 11:35:03 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 11:35:03 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 11:35:03 INFO sdk_worker.run: No more requests from control plane
19/12/03 11:35:03 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 11:35:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 11:35:03 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 11:35:03 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 11:35:03 INFO sdk_worker.run: Done consuming work.
19/12/03 11:35:03 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 11:35:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 11:35:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 11:35:04 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 11:35:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 11:35:04 INFO sdk_worker_main.main: Logging handler created.
19/12/03 11:35:04 INFO sdk_worker_main.start: Status HTTP server running at localhost:45601
19/12/03 11:35:04 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 11:35:04 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 11:35:04 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575372899.81_4d5c66d7-175f-482a-966c-31a058f1590f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 11:35:04 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575372899.81', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36757', 'job_port': u'0'}
19/12/03 11:35:04 INFO statecache.__init__: Creating state cache with size 0
19/12/03 11:35:04 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34499.
19/12/03 11:35:04 INFO sdk_worker.__init__: Control channel established.
19/12/03 11:35:04 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 11:35:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/03 11:35:04 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42113.
19/12/03 11:35:04 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 11:35:04 INFO data_plane.create_data_channel: Creating client data channel for localhost:45129
19/12/03 11:35:04 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 11:35:05 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 11:35:05 INFO sdk_worker.run: No more requests from control plane
19/12/03 11:35:05 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 11:35:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 11:35:05 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 11:35:05 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 11:35:05 INFO sdk_worker.run: Done consuming work.
19/12/03 11:35:05 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 11:35:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 11:35:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 11:35:05 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 11:35:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 11:35:05 INFO sdk_worker_main.main: Logging handler created.
19/12/03 11:35:05 INFO sdk_worker_main.start: Status HTTP server running at localhost:44193
19/12/03 11:35:05 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 11:35:05 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 11:35:05 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575372899.81_4d5c66d7-175f-482a-966c-31a058f1590f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 11:35:05 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575372899.81', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36757', 'job_port': u'0'}
19/12/03 11:35:05 INFO statecache.__init__: Creating state cache with size 0
19/12/03 11:35:05 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44999.
19/12/03 11:35:05 INFO sdk_worker.__init__: Control channel established.
19/12/03 11:35:05 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 11:35:05 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/03 11:35:05 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33289.
19/12/03 11:35:05 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 11:35:05 INFO data_plane.create_data_channel: Creating client data channel for localhost:32783
19/12/03 11:35:05 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 11:35:05 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 11:35:06 INFO sdk_worker.run: No more requests from control plane
19/12/03 11:35:06 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 11:35:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 11:35:06 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 11:35:06 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 11:35:06 INFO sdk_worker.run: Done consuming work.
19/12/03 11:35:06 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 11:35:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 11:35:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 11:35:06 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575372899.81_4d5c66d7-175f-482a-966c-31a058f1590f finished.
19/12/03 11:35:06 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/03 11:35:06 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_3608f4ff-9bdf-4f8d-b70f-a14a36d94e4c","basePath":"/tmp/sparktestIlqvH6"}: {}
java.io.FileNotFoundException: /tmp/sparktestIlqvH6/job_3608f4ff-9bdf-4f8d-b70f-a14a36d94e4c/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

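
Every timeout in this run hangs at the same spot: wait_until_finish blocks on the job service's state stream, and only the test alarm breaks the wait. A hedged alternative is a per-call gRPC deadline, so a wedged job service surfaces as DEADLINE_EXCEEDED instead of an indefinite block -- the stub method below is a placeholder, not the real JobService API:

    import grpc

    def wait_for_terminal_state(stub, request, deadline_secs=60):
        # GetStateStream stands in for the generated server-streaming
        # call; the point is the timeout= deadline on the RPC itself.
        try:
            for state_response in stub.GetStateStream(
                    request, timeout=deadline_secs):
                yield state_response
        except grpc.RpcError as e:
            if e.code() == grpc.StatusCode.DEADLINE_EXCEEDED:
                raise RuntimeError(
                    'No state update within %ss; job service appears hung.'
                    % deadline_secs)
            raise
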
======================================================================
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140527394232064)>

# Thread: <Thread(Thread-116, started daemon 140527385839360)>

# Thread: <_MainThread(MainThread, started 140528173471488)>

==================== Timed out after 60 seconds. ====================

ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_ru# Thread: <Thread(wait_until_finish_read, started daemon 140527286605568)>
nner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:

# Thread: <Thread(Thread-122, started daemon 140527294998272)>

# Thread: <_MainThread(MainThread, started 140528173471488)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
# Thread: <Thread(Thread-116, started daemon 140527385839360)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
# Thread: <Thread(wait_until_finish_read, started daemon 140527394232064)>
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575372888.95_dadbb2de-233f-4879-aec2-1bc4dd6caa0c failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 377.069s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 22s
60 actionable tasks: 48 executed, 12 from cache

Publishing build scan...
https://scans.gradle.com/s/qppimngrpirts

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1676

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1676/display/redirect?page=changes>

Changes:

[echauchot] [BEAM-8470] Update capability matrix: add Spark Structured Streaming

[echauchot] [BEAM-8470] Update Spark runner page: add Spark Structured Streaming


------------------------------------------
[...truncated 1.44 MB...]
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

# Thread: <Thread(wait_until_finish_read, started daemon 139703307187968)>

# Thread: <Thread(wait_until_finish_read, started daemon 139703919560448)>

# Thread: <Thread(wait_until_finish_read, started daemon 139704294000384)>

# Thread: <Thread(wait_until_finish_read, started daemon 139703323973376)>

# Thread: <Thread(Thread-132, started daemon 139703332366080)>

# Thread: <Thread(Thread-116, started daemon 139703944738560)>

# Thread: <Thread(wait_until_finish_read, started daemon 139703340758784)>

# Thread: <Thread(wait_until_finish_read, started daemon 139703902775040)>

# Thread: <Thread(Thread-120, started daemon 139703927953152)>

# Thread: <Thread(Thread-128, started daemon 139703894382336)>

# Thread: <Thread(Thread-136, started daemon 139703315580672)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139702795495168)>

# Thread: <Thread(Thread-144, started daemon 139702803887872)>

# Thread: <_MainThread(MainThread, started 139705073448704)>

# Thread: <Thread(Thread-124, started daemon 139703911167744)>

# Thread: <Thread(wait_until_finish_read, started daemon 139703307187968)>

# Thread: <Thread(Thread-136, started daemon 139703315580672)>

# Thread: <Thread(Thread-128, started daemon 139703894382336)>

# Thread: <Thread(wait_until_finish_read, started daemon 139703340758784)>

# Thread: <Thread(wait_until_finish_read, started daemon 139703902775040)>

# Thread: <Thread(wait_until_finish_read, started daemon 139703290402560)>

# Thread: <Thread(Thread-140, started daemon 139703298795264)>
======================================================================
ERROR: test_pardo_unfusable_side_inputs (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 244, in test_pardo_unfusable_side_inputs
    equal_to([('a', 'a'), ('a', 'b'), ('b', 'a'), ('b', 'b')]))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_windowed_side_inputs (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 181, in test_pardo_windowed_side_inputs
    label='windowed')
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_read (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 578, in test_read
    equal_to(['a', 'b', 'c']))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_reshuffle (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 548, in test_reshuffle
    equal_to([1, 2, 3]))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_check_done_failed (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 470, in test_sdf_with_check_done_failed
    | beam.ParDo(ExpandingStringsDoFn()))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

----------------------------------------------------------------------
Ran 38 tests in 692.383s

FAILED (errors=8, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 14m 4s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/6gsu4h7uagqdy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1675

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1675/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/12/03 06:11:56 INFO sdk_worker_main.start: Status HTTP server running at localhost:38311
19/12/03 06:11:56 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 06:11:56 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 06:11:56 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575353513.64_92d0cbe6-a7e1-4253-a1f6-6202c20bcd88', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 06:11:56 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575353513.64', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43039', 'job_port': u'0'}
19/12/03 06:11:56 INFO statecache.__init__: Creating state cache with size 0
19/12/03 06:11:56 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43883.
19/12/03 06:11:56 INFO sdk_worker.__init__: Control channel established.
19/12/03 06:11:56 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/03 06:11:56 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 06:11:56 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45281.
19/12/03 06:11:56 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 06:11:56 INFO data_plane.create_data_channel: Creating client data channel for localhost:41371
19/12/03 06:11:56 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 06:11:56 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 06:11:56 INFO sdk_worker.run: No more requests from control plane
19/12/03 06:11:56 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 06:11:56 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 06:11:56 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 06:11:56 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 06:11:56 INFO sdk_worker.run: Done consuming work.
19/12/03 06:11:56 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 06:11:56 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 06:11:56 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 06:11:56 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 06:11:56 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 06:11:56 INFO sdk_worker_main.main: Logging handler created.
19/12/03 06:11:56 INFO sdk_worker_main.start: Status HTTP server running at localhost:44881
19/12/03 06:11:56 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 06:11:56 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 06:11:56 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575353513.64_92d0cbe6-a7e1-4253-a1f6-6202c20bcd88', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 06:11:56 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575353513.64', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43039', 'job_port': u'0'}
19/12/03 06:11:56 INFO statecache.__init__: Creating state cache with size 0
19/12/03 06:11:56 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38645.
19/12/03 06:11:56 INFO sdk_worker.__init__: Control channel established.
19/12/03 06:11:56 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 06:11:56 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/03 06:11:56 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35489.
19/12/03 06:11:56 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 06:11:56 INFO data_plane.create_data_channel: Creating client data channel for localhost:39659
19/12/03 06:11:56 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 06:11:57 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 06:11:57 INFO sdk_worker.run: No more requests from control plane
19/12/03 06:11:57 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 06:11:57 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 06:11:57 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 06:11:57 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 06:11:57 INFO sdk_worker.run: Done consuming work.
19/12/03 06:11:57 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 06:11:57 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 06:11:57 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 06:11:57 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 06:11:57 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 06:11:57 INFO sdk_worker_main.main: Logging handler created.
19/12/03 06:11:57 INFO sdk_worker_main.start: Status HTTP server running at localhost:43481
19/12/03 06:11:57 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 06:11:57 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 06:11:57 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575353513.64_92d0cbe6-a7e1-4253-a1f6-6202c20bcd88', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 06:11:57 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575353513.64', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43039', 'job_port': u'0'}
19/12/03 06:11:57 INFO statecache.__init__: Creating state cache with size 0
19/12/03 06:11:57 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35193.
19/12/03 06:11:57 INFO sdk_worker.__init__: Control channel established.
19/12/03 06:11:57 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 06:11:57 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/03 06:11:57 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38835.
19/12/03 06:11:57 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 06:11:57 INFO data_plane.create_data_channel: Creating client data channel for localhost:39335
19/12/03 06:11:57 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 06:11:58 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 06:11:58 INFO sdk_worker.run: No more requests from control plane
19/12/03 06:11:58 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 06:11:58 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 06:11:58 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 06:11:58 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 06:11:58 INFO sdk_worker.run: Done consuming work.
19/12/03 06:11:58 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 06:11:58 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 06:11:58 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 06:11:58 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 06:11:58 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 06:11:58 INFO sdk_worker_main.main: Logging handler created.
19/12/03 06:11:58 INFO sdk_worker_main.start: Status HTTP server running at localhost:35341
19/12/03 06:11:58 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 06:11:58 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 06:11:58 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575353513.64_92d0cbe6-a7e1-4253-a1f6-6202c20bcd88', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 06:11:58 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575353513.64', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43039', 'job_port': u'0'}
19/12/03 06:11:58 INFO statecache.__init__: Creating state cache with size 0
19/12/03 06:11:58 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33683.
19/12/03 06:11:58 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/03 06:11:58 INFO sdk_worker.__init__: Control channel established.
19/12/03 06:11:58 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 06:11:58 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45869.
19/12/03 06:11:58 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 06:11:58 INFO data_plane.create_data_channel: Creating client data channel for localhost:34973
19/12/03 06:11:58 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 06:11:58 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 06:11:58 INFO sdk_worker.run: No more requests from control plane
19/12/03 06:11:58 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 06:11:58 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 06:11:58 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 06:11:58 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 06:11:58 INFO sdk_worker.run: Done consuming work.
19/12/03 06:11:58 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 06:11:58 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 06:11:58 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 06:11:58 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575353513.64_92d0cbe6-a7e1-4253-a1f6-6202c20bcd88 finished.
19/12/03 06:11:58 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/03 06:11:58 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_3783fbdc-5bf4-491e-9318-5c47b864da7a","basePath":"/tmp/sparktestuuUmOk"}: {}
java.io.FileNotFoundException: /tmp/sparktestuuUmOk/job_3783fbdc-5bf4-491e-9318-5c47b864da7a/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
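
This cleanup error is cosmetic: the job staged no artifacts (note the repeated "GetManifest for __no_artifacts_staged__" lines above), so no MANIFEST was ever written, yet staging-directory removal still tries to open one. A minimal sketch of the guard that would silence it, assuming local-filesystem staging (the paths and function name are illustrative, not Beam's actual cleanup code):

    import os

    def remove_job_staging_dir(base_path, session_id):
        # Hypothetical guard: skip cleanup when no manifest was staged,
        # avoiding the FileNotFoundException in the stack trace above.
        manifest = os.path.join(base_path, session_id, 'MANIFEST')
        if not os.path.exists(manifest):
            return  # nothing was staged for this job token
        # ... otherwise load the manifest and remove the listed artifacts
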
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
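
Note the handler frame at the bottom of this stack: portable_runner_test.py installs a watchdog that fires after 60 seconds, dumps every live thread (the "# Thread:" lines in the next failure below), and raises BaseException to abort the hung test. A minimal sketch of that pattern, assuming a SIGALRM-based watchdog on Unix (simplified; the real Beam handler differs in detail):

    import signal
    import sys
    import threading
    import traceback

    def install_timeout(timeout_secs=60):
        def handler(signum, frame):
            msg = 'Timed out after %d seconds.' % timeout_secs
            print('=' * 20 + ' ' + msg + ' ' + '=' * 20)
            # These prints are what ends up interleaved with the failing
            # test's own traceback in the raw console log.
            frames = sys._current_frames()
            for t in threading.enumerate():
                print('# Thread: %s' % t)
                traceback.print_stack(frames.get(t.ident))
            raise BaseException(msg)

        signal.signal(signal.SIGALRM, handler)
        signal.alarm(timeout_secs)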

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140260562347776)>
# Thread: <Thread(Thread-120, started daemon 140260478547712)>
# Thread: <_MainThread(MainThread, started 140261349848832)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140260461762304)>
# Thread: <Thread(Thread-126, started daemon 140260470155008)>
# Thread: <_MainThread(MainThread, started 140261349848832)>
# Thread: <Thread(Thread-120, started daemon 140260478547712)>
# Thread: <Thread(wait_until_finish_read, started daemon 140260562347776)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575353504.56_adff07a2-9b27-40e5-8a87-ee0a5c63ad35 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
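
Unlike the two timeouts above, this failure is a runner capability gap: the exception message says the Spark portable runner has no bundle checkpoint handler, which this splittable-DoFn watermark-tracking test exercises. Until that support lands, such tests are typically quarantined; a sketch of the usual unittest pattern (illustrative only, not this suite's actual wiring):

    import unittest

    class SparkRunnerTest(unittest.TestCase):  # name reused for illustration
        @unittest.skip('Spark portable runner: no bundle checkpoint handler yet')
        def test_sdf_with_watermark_tracking(self):
            pass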

----------------------------------------------------------------------
Ran 38 tests in 307.245s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 48s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/ohuuxjzlyffj4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1674

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1674/display/redirect?page=changes>

Changes:

[ehudm] [BEAM-7594] Fix flaky filename generation

[ehudm] [BEAM-8842] Disable the correct test


------------------------------------------
[...truncated 1.32 MB...]
19/12/03 04:56:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 04:56:02 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 04:56:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 04:56:03 INFO sdk_worker_main.main: Logging handler created.
19/12/03 04:56:03 INFO sdk_worker_main.start: Status HTTP server running at localhost:43485
19/12/03 04:56:03 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 04:56:03 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 04:56:03 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575348961.03_2a3020b5-f7d9-4a2c-90a3-99f314ad5869', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 04:56:03 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575348961.03', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:34033', 'job_port': u'0'}
19/12/03 04:56:03 INFO statecache.__init__: Creating state cache with size 0
19/12/03 04:56:03 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46327.
19/12/03 04:56:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/03 04:56:03 INFO sdk_worker.__init__: Control channel established.
19/12/03 04:56:03 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 04:56:03 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46579.
19/12/03 04:56:03 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 04:56:03 INFO data_plane.create_data_channel: Creating client data channel for localhost:41801
19/12/03 04:56:03 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 04:56:03 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 04:56:03 INFO sdk_worker.run: No more requests from control plane
19/12/03 04:56:03 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 04:56:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 04:56:03 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 04:56:03 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 04:56:03 INFO sdk_worker.run: Done consuming work.
19/12/03 04:56:03 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 04:56:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 04:56:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 04:56:03 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 04:56:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 04:56:04 INFO sdk_worker_main.main: Logging handler created.
19/12/03 04:56:04 INFO sdk_worker_main.start: Status HTTP server running at localhost:44809
19/12/03 04:56:04 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 04:56:04 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 04:56:04 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575348961.03_2a3020b5-f7d9-4a2c-90a3-99f314ad5869', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 04:56:04 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575348961.03', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:34033', 'job_port': u'0'}
19/12/03 04:56:04 INFO statecache.__init__: Creating state cache with size 0
19/12/03 04:56:04 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45041.
19/12/03 04:56:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/03 04:56:04 INFO sdk_worker.__init__: Control channel established.
19/12/03 04:56:04 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 04:56:04 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33217.
19/12/03 04:56:04 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 04:56:04 INFO data_plane.create_data_channel: Creating client data channel for localhost:37173
19/12/03 04:56:04 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 04:56:04 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 04:56:04 INFO sdk_worker.run: No more requests from control plane
19/12/03 04:56:04 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 04:56:04 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 04:56:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 04:56:04 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 04:56:04 INFO sdk_worker.run: Done consuming work.
19/12/03 04:56:04 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 04:56:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 04:56:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 04:56:04 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 04:56:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 04:56:05 INFO sdk_worker_main.main: Logging handler created.
19/12/03 04:56:05 INFO sdk_worker_main.start: Status HTTP server running at localhost:37807
19/12/03 04:56:05 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 04:56:05 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 04:56:05 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575348961.03_2a3020b5-f7d9-4a2c-90a3-99f314ad5869', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 04:56:05 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575348961.03', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:34033', 'job_port': u'0'}
19/12/03 04:56:05 INFO statecache.__init__: Creating state cache with size 0
19/12/03 04:56:05 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40929.
19/12/03 04:56:05 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/03 04:56:05 INFO sdk_worker.__init__: Control channel established.
19/12/03 04:56:05 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 04:56:05 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46855.
19/12/03 04:56:05 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 04:56:05 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 04:56:05 INFO data_plane.create_data_channel: Creating client data channel for localhost:42829
19/12/03 04:56:05 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 04:56:05 INFO sdk_worker.run: No more requests from control plane
19/12/03 04:56:05 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 04:56:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 04:56:05 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 04:56:05 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 04:56:05 INFO sdk_worker.run: Done consuming work.
19/12/03 04:56:05 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 04:56:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 04:56:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 04:56:05 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 04:56:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 04:56:05 INFO sdk_worker_main.main: Logging handler created.
19/12/03 04:56:05 INFO sdk_worker_main.start: Status HTTP server running at localhost:32973
19/12/03 04:56:05 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 04:56:05 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 04:56:05 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575348961.03_2a3020b5-f7d9-4a2c-90a3-99f314ad5869', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 04:56:05 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575348961.03', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:34033', 'job_port': u'0'}
19/12/03 04:56:05 INFO statecache.__init__: Creating state cache with size 0
19/12/03 04:56:05 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40649.
19/12/03 04:56:05 INFO sdk_worker.__init__: Control channel established.
19/12/03 04:56:05 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 04:56:05 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/03 04:56:05 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37059.
19/12/03 04:56:05 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 04:56:05 INFO data_plane.create_data_channel: Creating client data channel for localhost:33219
19/12/03 04:56:05 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 04:56:05 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 04:56:05 INFO sdk_worker.run: No more requests from control plane
19/12/03 04:56:05 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 04:56:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 04:56:05 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 04:56:05 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 04:56:05 INFO sdk_worker.run: Done consuming work.
19/12/03 04:56:05 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 04:56:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 04:56:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 04:56:06 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575348961.03_2a3020b5-f7d9-4a2c-90a3-99f314ad5869 finished.
19/12/03 04:56:06 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/03 04:56:06 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_eb85b03b-44c5-4090-861a-da8cb6305635","basePath":"/tmp/sparktestgSq5aU"}: {}
java.io.FileNotFoundException: /tmp/sparktestgSq5aU/job_eb85b03b-44c5-4090-861a-da8cb6305635/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140457482700544)>
# Thread: <Thread(Thread-118, started daemon 140457474307840)>
# Thread: <_MainThread(MainThread, started 140458261939968)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140456839083776)>
# Thread: <Thread(Thread-124, started daemon 140457456998144)>
# Thread: <_MainThread(MainThread, started 140458261939968)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575348952.38_328c2416-8400-4f35-8e58-8cea5540977e failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 308.213s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 32s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/3m5z6hcuwf3pa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1673

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1673/display/redirect>

Changes:


------------------------------------------
[...truncated 1.31 MB...]
19/12/03 03:38:28 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1575344307.31_0cc2673a-9ca2-4ff7-af16-607d06812be0 on Spark master local
19/12/03 03:38:28 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/12/03 03:38:28 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575344307.31_0cc2673a-9ca2-4ff7-af16-607d06812be0: Pipeline translated successfully. Computing outputs
19/12/03 03:38:28 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 03:38:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 03:38:29 INFO sdk_worker_main.main: Logging handler created.
19/12/03 03:38:29 INFO sdk_worker_main.start: Status HTTP server running at localhost:35273
19/12/03 03:38:29 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 03:38:29 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 03:38:29 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575344307.31_0cc2673a-9ca2-4ff7-af16-607d06812be0', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 03:38:29 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575344307.31', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53999', 'job_port': u'0'}
19/12/03 03:38:29 INFO statecache.__init__: Creating state cache with size 0
19/12/03 03:38:29 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40597.
19/12/03 03:38:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/03 03:38:29 INFO sdk_worker.__init__: Control channel established.
19/12/03 03:38:29 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 03:38:29 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41231.
19/12/03 03:38:29 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 03:38:29 INFO data_plane.create_data_channel: Creating client data channel for localhost:42959
19/12/03 03:38:29 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 03:38:29 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 03:38:29 INFO sdk_worker.run: No more requests from control plane
19/12/03 03:38:29 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 03:38:29 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 03:38:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:38:29 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 03:38:29 INFO sdk_worker.run: Done consuming work.
19/12/03 03:38:29 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 03:38:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 03:38:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:38:29 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 03:38:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 03:38:29 INFO sdk_worker_main.main: Logging handler created.
19/12/03 03:38:29 INFO sdk_worker_main.start: Status HTTP server running at localhost:44077
19/12/03 03:38:29 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 03:38:29 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 03:38:29 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575344307.31_0cc2673a-9ca2-4ff7-af16-607d06812be0', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 03:38:29 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575344307.31', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53999', 'job_port': u'0'}
19/12/03 03:38:29 INFO statecache.__init__: Creating state cache with size 0
19/12/03 03:38:29 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44601.
19/12/03 03:38:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/03 03:38:29 INFO sdk_worker.__init__: Control channel established.
19/12/03 03:38:29 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 03:38:29 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40855.
19/12/03 03:38:29 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 03:38:29 INFO data_plane.create_data_channel: Creating client data channel for localhost:33653
19/12/03 03:38:29 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 03:38:30 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 03:38:30 INFO sdk_worker.run: No more requests from control plane
19/12/03 03:38:30 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 03:38:30 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 03:38:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:38:30 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 03:38:30 INFO sdk_worker.run: Done consuming work.
19/12/03 03:38:30 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 03:38:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 03:38:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:38:30 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 03:38:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 03:38:30 INFO sdk_worker_main.main: Logging handler created.
19/12/03 03:38:30 INFO sdk_worker_main.start: Status HTTP server running at localhost:41775
19/12/03 03:38:30 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 03:38:30 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 03:38:30 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575344307.31_0cc2673a-9ca2-4ff7-af16-607d06812be0', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 03:38:30 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575344307.31', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53999', 'job_port': u'0'}
19/12/03 03:38:30 INFO statecache.__init__: Creating state cache with size 0
19/12/03 03:38:30 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46037.
19/12/03 03:38:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/03 03:38:30 INFO sdk_worker.__init__: Control channel established.
19/12/03 03:38:30 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 03:38:30 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45403.
19/12/03 03:38:30 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 03:38:30 INFO data_plane.create_data_channel: Creating client data channel for localhost:44897
19/12/03 03:38:30 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 03:38:30 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 03:38:30 INFO sdk_worker.run: No more requests from control plane
19/12/03 03:38:30 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 03:38:30 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 03:38:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:38:30 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 03:38:30 INFO sdk_worker.run: Done consuming work.
19/12/03 03:38:30 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 03:38:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 03:38:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:38:31 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 03:38:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 03:38:31 INFO sdk_worker_main.main: Logging handler created.
19/12/03 03:38:31 INFO sdk_worker_main.start: Status HTTP server running at localhost:36857
19/12/03 03:38:31 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 03:38:31 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 03:38:31 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575344307.31_0cc2673a-9ca2-4ff7-af16-607d06812be0', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 03:38:31 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575344307.31', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53999', 'job_port': u'0'}
19/12/03 03:38:31 INFO statecache.__init__: Creating state cache with size 0
19/12/03 03:38:31 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42177.
19/12/03 03:38:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/03 03:38:31 INFO sdk_worker.__init__: Control channel established.
19/12/03 03:38:31 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 03:38:31 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37347.
19/12/03 03:38:31 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 03:38:31 INFO data_plane.create_data_channel: Creating client data channel for localhost:42025
19/12/03 03:38:31 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 03:38:31 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 03:38:31 INFO sdk_worker.run: No more requests from control plane
19/12/03 03:38:31 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 03:38:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:38:31 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 03:38:31 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 03:38:31 INFO sdk_worker.run: Done consuming work.
19/12/03 03:38:31 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 03:38:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 03:38:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:38:31 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 03:38:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 03:38:32 INFO sdk_worker_main.main: Logging handler created.
19/12/03 03:38:32 INFO sdk_worker_main.start: Status HTTP server running at localhost:36345
19/12/03 03:38:32 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 03:38:32 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 03:38:32 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575344307.31_0cc2673a-9ca2-4ff7-af16-607d06812be0', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 03:38:32 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575344307.31', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53999', 'job_port': u'0'}
19/12/03 03:38:32 INFO statecache.__init__: Creating state cache with size 0
19/12/03 03:38:32 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33699.
19/12/03 03:38:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/03 03:38:32 INFO sdk_worker.__init__: Control channel established.
19/12/03 03:38:32 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 03:38:32 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35675.
19/12/03 03:38:32 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 03:38:32 INFO data_plane.create_data_channel: Creating client data channel for localhost:33887
19/12/03 03:38:32 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 03:38:32 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 03:38:32 INFO sdk_worker.run: No more requests from control plane
19/12/03 03:38:32 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 03:38:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:38:32 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 03:38:32 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 03:38:32 INFO sdk_worker.run: Done consuming work.
19/12/03 03:38:32 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 03:38:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 03:38:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:38:32 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575344307.31_0cc2673a-9ca2-4ff7-af16-607d06812be0 finished.
19/12/03 03:38:32 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/03 03:38:32 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_a9e1fbaa-8200-4174-9cc9-bc8efd84af98","basePath":"/tmp/sparktestKpO9GN"}: {}
java.io.FileNotFoundException: /tmp/sparktestKpO9GN/job_a9e1fbaa-8200-4174-9cc9-bc8efd84af98/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
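
The ERROR above is a cleanup race rather than a pipeline failure: the job reaches DONE, but InMemoryJobService then tries to read a MANIFEST that was never written, because this run staged no artifacts (see the earlier "GetManifest for __no_artifacts_staged__" lines). A minimal sketch of a more tolerant cleanup, written in Python for brevity and with hypothetical names (the actual code path is the Java BeamFileSystemArtifactStagingService.removeArtifacts shown in the stack trace):

    import errno
    import os
    import shutil

    def remove_job_staging_dir(base_path, session_id):
        # Hypothetical sketch, not Beam's implementation: treat a missing
        # MANIFEST as "nothing was staged" instead of raising.
        staging_dir = os.path.join(base_path, session_id)
        if not os.path.exists(os.path.join(staging_dir, 'MANIFEST')):
            return  # nothing was staged, so there is nothing to remove
        try:
            shutil.rmtree(staging_dir)
        except OSError as e:
            if e.errno != errno.ENOENT:  # tolerate concurrent removal
                raise
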
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140097127180032)>

# Thread: <Thread(Thread-115, started daemon 140097110394624)>

# Thread: <_MainThread(MainThread, started 140097981904640)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575344298.04_d3c5337f-5af4-4db2-9cd2-9b62c0e077c1 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
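
The UnsupportedOperationException pinpoints the cause of this failure: the splittable DoFn in the test hands back a residual checkpoint, but the portable Spark runner never registered a handler for it on the active bundle. In illustrative pseudocode (class and method names are invented here; the real ActiveBundle lives in Beam's Java fn-execution layer), the failure mode amounts to:

    class ActiveBundle(object):
        # Illustration only, not Beam's actual class.
        def __init__(self, checkpoint_handler=None):
            self._checkpoint_handler = checkpoint_handler

        def on_checkpoint(self, residual):
            if self._checkpoint_handler is None:
                raise NotImplementedError(
                    'The ActiveBundle does not have a registered '
                    'bundle checkpoint handler.')
            self._checkpoint_handler(residual)

A runner that supports SDF checkpointing would supply a real handler when starting the bundle; until the Spark portable runner does, this test keeps failing the same way across the builds below.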

----------------------------------------------------------------------
Ran 38 tests in 288.880s

FAILED (errors=2, skipped=9)
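
The "Timed out after 60 seconds" errors throughout these reports come from the test suite's own watchdog: portable_runner_test.py line 75 (visible in the tracebacks) raises BaseException from a signal handler after printing a banner and a dump of every live thread, which is why the "# Thread: ..." lines interleave with the tracebacks above. A minimal sketch of such a watchdog, assuming a SIGALRM-based design:

    import signal
    import sys
    import threading
    import traceback

    def install_timeout(seconds=60):
        # Assumed sketch: dump each live thread's stack at timeout, then
        # raise BaseException so the test run records an ERROR.
        def handler(signum, frame):
            msg = 'Timed out after %d seconds.' % seconds
            print('==================== %s ====================' % msg)
            for t in threading.enumerate():
                print('# Thread: %s' % t)
                stack = sys._current_frames().get(t.ident)
                if stack is not None:
                    traceback.print_stack(stack)
            raise BaseException(msg)
        signal.signal(signal.SIGALRM, handler)
        signal.alarm(seconds)
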

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 20s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/slo47yhmrgdyk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1672

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1672/display/redirect?page=changes>

Changes:

[altay] Increase overhead budget for test_sampler_transition_overhead

[aaltay] [BEAM-8814] Changed no_auth option from bool to store_true (#10202)


------------------------------------------
[...truncated 1.32 MB...]
19/12/03 03:26:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:26:20 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 03:26:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 03:26:20 INFO sdk_worker_main.main: Logging handler created.
19/12/03 03:26:20 INFO sdk_worker_main.start: Status HTTP server running at localhost:40473
19/12/03 03:26:20 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 03:26:20 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 03:26:20 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575343578.26_87a06f0d-ad4f-4dc3-8519-e4c9a40ef0ef', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 03:26:20 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575343578.26', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42045', 'job_port': u'0'}
19/12/03 03:26:20 INFO statecache.__init__: Creating state cache with size 0
19/12/03 03:26:20 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43083.
19/12/03 03:26:20 INFO sdk_worker.__init__: Control channel established.
19/12/03 03:26:20 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/03 03:26:20 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 03:26:20 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40017.
19/12/03 03:26:20 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 03:26:20 INFO data_plane.create_data_channel: Creating client data channel for localhost:44987
19/12/03 03:26:20 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 03:26:20 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 03:26:21 INFO sdk_worker.run: No more requests from control plane
19/12/03 03:26:21 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 03:26:21 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 03:26:21 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:26:21 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 03:26:21 INFO sdk_worker.run: Done consuming work.
19/12/03 03:26:21 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 03:26:21 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 03:26:21 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:26:21 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 03:26:21 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 03:26:21 INFO sdk_worker_main.main: Logging handler created.
19/12/03 03:26:21 INFO sdk_worker_main.start: Status HTTP server running at localhost:36621
19/12/03 03:26:21 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 03:26:21 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 03:26:21 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575343578.26_87a06f0d-ad4f-4dc3-8519-e4c9a40ef0ef', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 03:26:21 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575343578.26', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42045', 'job_port': u'0'}
19/12/03 03:26:21 INFO statecache.__init__: Creating state cache with size 0
19/12/03 03:26:21 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41435.
19/12/03 03:26:21 INFO sdk_worker.__init__: Control channel established.
19/12/03 03:26:21 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 03:26:21 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/03 03:26:21 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34969.
19/12/03 03:26:21 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 03:26:21 INFO data_plane.create_data_channel: Creating client data channel for localhost:34289
19/12/03 03:26:21 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 03:26:21 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 03:26:21 INFO sdk_worker.run: No more requests from control plane
19/12/03 03:26:21 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 03:26:21 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:26:21 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 03:26:21 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 03:26:21 INFO sdk_worker.run: Done consuming work.
19/12/03 03:26:21 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 03:26:21 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 03:26:22 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:26:22 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 03:26:22 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 03:26:22 INFO sdk_worker_main.main: Logging handler created.
19/12/03 03:26:22 INFO sdk_worker_main.start: Status HTTP server running at localhost:44889
19/12/03 03:26:22 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 03:26:22 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 03:26:22 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575343578.26_87a06f0d-ad4f-4dc3-8519-e4c9a40ef0ef', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 03:26:22 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575343578.26', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42045', 'job_port': u'0'}
19/12/03 03:26:22 INFO statecache.__init__: Creating state cache with size 0
19/12/03 03:26:22 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38269.
19/12/03 03:26:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/03 03:26:22 INFO sdk_worker.__init__: Control channel established.
19/12/03 03:26:22 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 03:26:22 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42941.
19/12/03 03:26:22 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 03:26:22 INFO data_plane.create_data_channel: Creating client data channel for localhost:35953
19/12/03 03:26:22 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 03:26:22 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 03:26:22 INFO sdk_worker.run: No more requests from control plane
19/12/03 03:26:22 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 03:26:22 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 03:26:22 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:26:22 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 03:26:22 INFO sdk_worker.run: Done consuming work.
19/12/03 03:26:22 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 03:26:22 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 03:26:22 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:26:22 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 03:26:23 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 03:26:23 INFO sdk_worker_main.main: Logging handler created.
19/12/03 03:26:23 INFO sdk_worker_main.start: Status HTTP server running at localhost:37481
19/12/03 03:26:23 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 03:26:23 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 03:26:23 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575343578.26_87a06f0d-ad4f-4dc3-8519-e4c9a40ef0ef', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 03:26:23 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575343578.26', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42045', 'job_port': u'0'}
19/12/03 03:26:23 INFO statecache.__init__: Creating state cache with size 0
19/12/03 03:26:23 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44515.
19/12/03 03:26:23 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/03 03:26:23 INFO sdk_worker.__init__: Control channel established.
19/12/03 03:26:23 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 03:26:23 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34535.
19/12/03 03:26:23 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 03:26:23 INFO data_plane.create_data_channel: Creating client data channel for localhost:36537
19/12/03 03:26:23 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 03:26:23 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 03:26:23 INFO sdk_worker.run: No more requests from control plane
19/12/03 03:26:23 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 03:26:23 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 03:26:23 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:26:23 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 03:26:23 INFO sdk_worker.run: Done consuming work.
19/12/03 03:26:23 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 03:26:23 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 03:26:23 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:26:23 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575343578.26_87a06f0d-ad4f-4dc3-8519-e4c9a40ef0ef finished.
19/12/03 03:26:23 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/03 03:26:23 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_884a85e9-a9a2-4444-a033-ac3d45acd999","basePath":"/tmp/sparktestji0bvb"}: {}
java.io.FileNotFoundException: /tmp/sparktestji0bvb/job_884a85e9-a9a2-4444-a033-ac3d45acd999/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140416640874240)>

# Thread: <Thread(Thread-119, started daemon 140416988854016)>

# Thread: <_MainThread(MainThread, started 140417768093440)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140416624088832)>

# Thread: <Thread(Thread-125, started daemon 140416615696128)>

# Thread: <_MainThread(MainThread, started 140417768093440)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575343568.19_6cdcb876-5d7e-4311-9d5e-3b811cbea544 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 297.314s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 47s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/bgkbtbm5mdjic

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1671

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1671/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-8863] experiment=beam_fn_api in runtime/environments page


------------------------------------------
[...truncated 1.32 MB...]
19/12/03 03:09:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:09:27 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 03:09:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 03:09:27 INFO sdk_worker_main.main: Logging handler created.
19/12/03 03:09:27 INFO sdk_worker_main.start: Status HTTP server running at localhost:35947
19/12/03 03:09:27 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 03:09:27 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 03:09:27 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575342565.27_f8ea50c7-0c3f-40e5-8dc8-ceb78d4d89d7', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 03:09:27 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575342565.27', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37941', 'job_port': u'0'}
19/12/03 03:09:27 INFO statecache.__init__: Creating state cache with size 0
19/12/03 03:09:27 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34729.
19/12/03 03:09:27 INFO sdk_worker.__init__: Control channel established.
19/12/03 03:09:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/03 03:09:27 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 03:09:27 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45231.
19/12/03 03:09:27 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 03:09:27 INFO data_plane.create_data_channel: Creating client data channel for localhost:38925
19/12/03 03:09:27 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 03:09:28 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 03:09:28 INFO sdk_worker.run: No more requests from control plane
19/12/03 03:09:28 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 03:09:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:09:28 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 03:09:28 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 03:09:28 INFO sdk_worker.run: Done consuming work.
19/12/03 03:09:28 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 03:09:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 03:09:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:09:28 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 03:09:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 03:09:28 INFO sdk_worker_main.main: Logging handler created.
19/12/03 03:09:28 INFO sdk_worker_main.start: Status HTTP server running at localhost:35263
19/12/03 03:09:28 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 03:09:28 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 03:09:28 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575342565.27_f8ea50c7-0c3f-40e5-8dc8-ceb78d4d89d7', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 03:09:28 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575342565.27', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37941', 'job_port': u'0'}
19/12/03 03:09:28 INFO statecache.__init__: Creating state cache with size 0
19/12/03 03:09:28 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42421.
19/12/03 03:09:28 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/03 03:09:28 INFO sdk_worker.__init__: Control channel established.
19/12/03 03:09:28 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 03:09:28 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34723.
19/12/03 03:09:28 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 03:09:28 INFO data_plane.create_data_channel: Creating client data channel for localhost:46359
19/12/03 03:09:28 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 03:09:29 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 03:09:29 INFO sdk_worker.run: No more requests from control plane
19/12/03 03:09:29 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 03:09:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:09:29 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 03:09:29 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 03:09:29 INFO sdk_worker.run: Done consuming work.
19/12/03 03:09:29 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 03:09:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 03:09:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:09:29 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 03:09:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 03:09:29 INFO sdk_worker_main.main: Logging handler created.
19/12/03 03:09:29 INFO sdk_worker_main.start: Status HTTP server running at localhost:45717
19/12/03 03:09:29 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 03:09:29 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 03:09:29 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575342565.27_f8ea50c7-0c3f-40e5-8dc8-ceb78d4d89d7', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 03:09:29 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575342565.27', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37941', 'job_port': u'0'}
19/12/03 03:09:29 INFO statecache.__init__: Creating state cache with size 0
19/12/03 03:09:29 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39369.
19/12/03 03:09:29 INFO sdk_worker.__init__: Control channel established.
19/12/03 03:09:29 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 03:09:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/03 03:09:29 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45855.
19/12/03 03:09:29 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 03:09:29 INFO data_plane.create_data_channel: Creating client data channel for localhost:42483
19/12/03 03:09:29 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 03:09:29 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 03:09:29 INFO sdk_worker.run: No more requests from control plane
19/12/03 03:09:29 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 03:09:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:09:29 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 03:09:29 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 03:09:29 INFO sdk_worker.run: Done consuming work.
19/12/03 03:09:29 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 03:09:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 03:09:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:09:30 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 03:09:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 03:09:30 INFO sdk_worker_main.main: Logging handler created.
19/12/03 03:09:30 INFO sdk_worker_main.start: Status HTTP server running at localhost:33103
19/12/03 03:09:30 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 03:09:30 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 03:09:30 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575342565.27_f8ea50c7-0c3f-40e5-8dc8-ceb78d4d89d7', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 03:09:30 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575342565.27', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37941', 'job_port': u'0'}
19/12/03 03:09:30 INFO statecache.__init__: Creating state cache with size 0
19/12/03 03:09:30 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46477.
19/12/03 03:09:30 INFO sdk_worker.__init__: Control channel established.
19/12/03 03:09:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/03 03:09:30 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 03:09:30 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36689.
19/12/03 03:09:30 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 03:09:30 INFO data_plane.create_data_channel: Creating client data channel for localhost:42155
19/12/03 03:09:30 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 03:09:30 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 03:09:30 INFO sdk_worker.run: No more requests from control plane
19/12/03 03:09:30 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 03:09:30 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 03:09:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:09:30 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 03:09:30 INFO sdk_worker.run: Done consuming work.
19/12/03 03:09:30 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 03:09:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 03:09:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:09:31 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575342565.27_f8ea50c7-0c3f-40e5-8dc8-ceb78d4d89d7 finished.
19/12/03 03:09:31 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/03 03:09:31 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_982fecfb-14df-4eb0-b936-52f71a96ac17","basePath":"/tmp/sparktestZwXDPf"}: {}
java.io.FileNotFoundException: /tmp/sparktestZwXDPf/job_982fecfb-14df-4eb0-b936-52f71a96ac17/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_assert_that (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 101, in test_assert_that
    assert_that(p | beam.Create(['a', 'b']), equal_to(['a']))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140664983860992)>

# Thread: <Thread(Thread-5, started daemon 140664992253696)>

# Thread: <_MainThread(MainThread, started 140665773664000)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140664983860992)>

# Thread: <Thread(Thread-115, started daemon 140664967075584)>

# Thread: <_MainThread(MainThread, started 140665773664000)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575342553.69_6b0bf8de-7c5e-4a82-96ea-7a2097f23dff failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 385.950s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 10m 0s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/ph6yewtsraoay

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1670

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1670/display/redirect?page=changes>

Changes:

[lcwik] [BEAM-2929] Ensure that the Beam Java SDK sends the property

[lcwik] [BEAM-2929] Ensure that the Beam Go SDK sends the property

[lcwik] [BEAM-2929] Ensure that the Beam Python SDK sends the property

[lostluck] [BEAM-2929] Fix go code format for


------------------------------------------
[...truncated 1.32 MB...]
19/12/03 00:11:06 INFO sdk_worker_main.start: Status HTTP server running at localhost:33599
19/12/03 00:11:06 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 00:11:06 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 00:11:06 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575331864.06_31c3dda5-0f98-4a2f-a9c6-8be47e52c740', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 00:11:06 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575331864.06', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43113', 'job_port': u'0'}
19/12/03 00:11:06 INFO statecache.__init__: Creating state cache with size 0
19/12/03 00:11:06 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37981.
19/12/03 00:11:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/03 00:11:06 INFO sdk_worker.__init__: Control channel established.
19/12/03 00:11:06 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 00:11:06 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46153.
19/12/03 00:11:06 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 00:11:06 INFO data_plane.create_data_channel: Creating client data channel for localhost:44163
19/12/03 00:11:06 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 00:11:06 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 00:11:06 INFO sdk_worker.run: No more requests from control plane
19/12/03 00:11:06 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 00:11:06 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 00:11:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 00:11:06 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 00:11:06 INFO sdk_worker.run: Done consuming work.
19/12/03 00:11:06 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 00:11:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 00:11:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 00:11:07 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
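
Each of these repeated worker start-ups corresponds to one bundle: with environment_cache_millis set to 0 in the pipeline_options dict logged above, the PROCESS environment is torn down after every bundle, so the start-up/shutdown cycle repeats for each test. An illustrative way to reproduce that configuration when submitting a pipeline (endpoint and script path are placeholders for this sketch):

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:43113',
        '--environment_type=PROCESS',
        '--environment_config={"command": "/path/to/sdk_worker.sh"}',
        '--experiments=beam_fn_api',
    ])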
19/12/03 00:11:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 00:11:07 INFO sdk_worker_main.main: Logging handler created.
19/12/03 00:11:07 INFO sdk_worker_main.start: Status HTTP server running at localhost:35101
19/12/03 00:11:07 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 00:11:07 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 00:11:07 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575331864.06_31c3dda5-0f98-4a2f-a9c6-8be47e52c740', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 00:11:07 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575331864.06', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43113', 'job_port': u'0'}
19/12/03 00:11:07 INFO statecache.__init__: Creating state cache with size 0
19/12/03 00:11:07 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37429.
19/12/03 00:11:07 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/03 00:11:07 INFO sdk_worker.__init__: Control channel established.
19/12/03 00:11:07 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 00:11:07 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43551.
19/12/03 00:11:07 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 00:11:07 INFO data_plane.create_data_channel: Creating client data channel for localhost:34943
19/12/03 00:11:07 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 00:11:07 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 00:11:07 INFO sdk_worker.run: No more requests from control plane
19/12/03 00:11:07 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 00:11:07 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 00:11:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 00:11:07 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 00:11:07 INFO sdk_worker.run: Done consuming work.
19/12/03 00:11:07 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 00:11:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 00:11:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 00:11:08 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 00:11:08 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 00:11:08 INFO sdk_worker_main.main: Logging handler created.
19/12/03 00:11:08 INFO sdk_worker_main.start: Status HTTP server running at localhost:38241
19/12/03 00:11:08 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 00:11:08 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 00:11:08 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575331864.06_31c3dda5-0f98-4a2f-a9c6-8be47e52c740', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 00:11:08 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575331864.06', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43113', 'job_port': u'0'}
19/12/03 00:11:08 INFO statecache.__init__: Creating state cache with size 0
19/12/03 00:11:08 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43349.
19/12/03 00:11:08 INFO sdk_worker.__init__: Control channel established.
19/12/03 00:11:08 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 00:11:08 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/03 00:11:08 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40377.
19/12/03 00:11:08 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 00:11:08 INFO data_plane.create_data_channel: Creating client data channel for localhost:37203
19/12/03 00:11:08 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 00:11:08 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 00:11:08 INFO sdk_worker.run: No more requests from control plane
19/12/03 00:11:08 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 00:11:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 00:11:08 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 00:11:08 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 00:11:08 INFO sdk_worker.run: Done consuming work.
19/12/03 00:11:08 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 00:11:08 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 00:11:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 00:11:09 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 00:11:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 00:11:09 INFO sdk_worker_main.main: Logging handler created.
19/12/03 00:11:09 INFO sdk_worker_main.start: Status HTTP server running at localhost:43493
19/12/03 00:11:09 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 00:11:09 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 00:11:09 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575331864.06_31c3dda5-0f98-4a2f-a9c6-8be47e52c740', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 00:11:09 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575331864.06', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43113', 'job_port': u'0'}
19/12/03 00:11:09 INFO statecache.__init__: Creating state cache with size 0
19/12/03 00:11:09 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46075.
19/12/03 00:11:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/03 00:11:09 INFO sdk_worker.__init__: Control channel established.
19/12/03 00:11:09 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 00:11:09 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37617.
19/12/03 00:11:09 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 00:11:09 INFO data_plane.create_data_channel: Creating client data channel for localhost:41875
19/12/03 00:11:09 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 00:11:09 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 00:11:09 INFO sdk_worker.run: No more requests from control plane
19/12/03 00:11:09 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 00:11:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 00:11:09 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 00:11:09 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 00:11:09 INFO sdk_worker.run: Done consuming work.
19/12/03 00:11:09 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 00:11:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 00:11:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 00:11:09 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575331864.06_31c3dda5-0f98-4a2f-a9c6-8be47e52c740 finished.
19/12/03 00:11:09 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/03 00:11:09 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_27aee76c-7c13-49eb-9c1f-4c2dde96b503","basePath":"/tmp/sparktestmA8Uhd"}: {}
java.io.FileNotFoundException: /tmp/sparktestmA8Uhd/job_27aee76c-7c13-49eb-9c1f-4c2dde96b503/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
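
This cleanup error is cosmetic: the job staged no artifacts (note the repeated "GetManifest for __no_artifacts_staged__" lines), so the MANIFEST file the removal path tries to load never existed, and the job still reaches DONE. A hypothetical defensive variant of that cleanup, sketched in Python rather than Beam's actual Java implementation:

    import os
    import shutil

    def remove_job_staging_dir(base_path, session_id):
        job_dir = os.path.join(base_path, session_id)
        manifest = os.path.join(job_dir, 'MANIFEST')
        if not os.path.exists(manifest):
            return  # nothing was staged for this job; cleanup is a no-op
        shutil.rmtree(job_dir)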
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
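
The grpc/_channel.py frames in these timeouts show the runner blocked in the synchronous iterator of a server-streaming RPC (the job service's GetStateStream) with no deadline, so a stalled job server hangs until the watchdog fires. A generic sketch of consuming such a stream with a deadline instead (the stub wiring is hypothetical, not Beam's actual client code):

    import grpc

    def watch_state(stub, request, deadline_secs=60):
        try:
            # timeout= puts a deadline on the whole stream instead of
            # blocking forever inside _common.wait as seen above.
            for state_response in stub.GetStateStream(request, timeout=deadline_secs):
                print(state_response)
        except grpc.RpcError as err:
            if err.code() == grpc.StatusCode.DEADLINE_EXCEEDED:
                raise RuntimeError('job server stopped streaming state updates')
            raise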

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140191893288704)>
# Thread: <Thread(Thread-119, started daemon 140191978215168)>
# Thread: <_MainThread(MainThread, started 140192757323520)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140191876241152)>
# Thread: <Thread(Thread-125, started daemon 140191884896000)>
# Thread: <Thread(Thread-119, started daemon 140191978215168)>
# Thread: <_MainThread(MainThread, started 140192757323520)>
# Thread: <Thread(wait_until_finish_read, started daemon 140191893288704)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575331854.27_c7750d1d-f9f2-4853-8d7e-0a77302ac586 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 320.375s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 1s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/fq5drf6ofabrq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1669

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1669/display/redirect?page=changes>

Changes:

[ehudm] [BEAM-4132] Set multi-output PCollections types to Any

[migryz] Bump Release Build Timeout

[migryz] fix syntax

[github] Bump time to 5 hours.

[robertwb] [BEAM-8645] A test case for TimestampCombiner. (#10081)


------------------------------------------
[...truncated 1.32 MB...]
19/12/02 23:07:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 23:07:38 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 23:07:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 23:07:39 INFO sdk_worker_main.main: Logging handler created.
19/12/02 23:07:39 INFO sdk_worker_main.start: Status HTTP server running at localhost:35671
19/12/02 23:07:39 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 23:07:39 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 23:07:39 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575328056.3_afcf8989-ddf5-44c2-9134-a79fd6e0b154', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 23:07:39 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575328056.3', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52653', 'job_port': u'0'}
19/12/02 23:07:39 INFO statecache.__init__: Creating state cache with size 0
19/12/02 23:07:39 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34507.
19/12/02 23:07:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/02 23:07:39 INFO sdk_worker.__init__: Control channel established.
19/12/02 23:07:39 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 23:07:39 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45819.
19/12/02 23:07:39 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 23:07:39 INFO data_plane.create_data_channel: Creating client data channel for localhost:45147
19/12/02 23:07:39 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 23:07:39 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/02 23:07:39 INFO sdk_worker.run: No more requests from control plane
19/12/02 23:07:39 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 23:07:39 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 23:07:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 23:07:39 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 23:07:39 INFO sdk_worker.run: Done consuming work.
19/12/02 23:07:39 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 23:07:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 23:07:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 23:07:39 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 23:07:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 23:07:40 INFO sdk_worker_main.main: Logging handler created.
19/12/02 23:07:40 INFO sdk_worker_main.start: Status HTTP server running at localhost:37879
19/12/02 23:07:40 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 23:07:40 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 23:07:40 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575328056.3_afcf8989-ddf5-44c2-9134-a79fd6e0b154', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 23:07:40 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575328056.3', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52653', 'job_port': u'0'}
19/12/02 23:07:40 INFO statecache.__init__: Creating state cache with size 0
19/12/02 23:07:40 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42563.
19/12/02 23:07:40 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/02 23:07:40 INFO sdk_worker.__init__: Control channel established.
19/12/02 23:07:40 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 23:07:40 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40577.
19/12/02 23:07:40 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 23:07:40 INFO data_plane.create_data_channel: Creating client data channel for localhost:36769
19/12/02 23:07:40 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 23:07:40 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/02 23:07:40 INFO sdk_worker.run: No more requests from control plane
19/12/02 23:07:40 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 23:07:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 23:07:40 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 23:07:40 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 23:07:40 INFO sdk_worker.run: Done consuming work.
19/12/02 23:07:40 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 23:07:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 23:07:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 23:07:40 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 23:07:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 23:07:41 INFO sdk_worker_main.main: Logging handler created.
19/12/02 23:07:41 INFO sdk_worker_main.start: Status HTTP server running at localhost:45151
19/12/02 23:07:41 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 23:07:41 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 23:07:41 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575328056.3_afcf8989-ddf5-44c2-9134-a79fd6e0b154', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 23:07:41 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575328056.3', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52653', 'job_port': u'0'}
19/12/02 23:07:41 INFO statecache.__init__: Creating state cache with size 0
19/12/02 23:07:41 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38761.
19/12/02 23:07:41 INFO sdk_worker.__init__: Control channel established.
19/12/02 23:07:41 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 23:07:41 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/02 23:07:41 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40951.
19/12/02 23:07:41 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 23:07:41 INFO data_plane.create_data_channel: Creating client data channel for localhost:39553
19/12/02 23:07:41 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 23:07:41 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/02 23:07:41 INFO sdk_worker.run: No more requests from control plane
19/12/02 23:07:41 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 23:07:41 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 23:07:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 23:07:41 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 23:07:41 INFO sdk_worker.run: Done consuming work.
19/12/02 23:07:41 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 23:07:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 23:07:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 23:07:41 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 23:07:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 23:07:42 INFO sdk_worker_main.main: Logging handler created.
19/12/02 23:07:42 INFO sdk_worker_main.start: Status HTTP server running at localhost:43565
19/12/02 23:07:42 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 23:07:42 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 23:07:42 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575328056.3_afcf8989-ddf5-44c2-9134-a79fd6e0b154', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 23:07:42 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575328056.3', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52653', 'job_port': u'0'}
19/12/02 23:07:42 INFO statecache.__init__: Creating state cache with size 0
19/12/02 23:07:42 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33313.
19/12/02 23:07:42 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/02 23:07:42 INFO sdk_worker.__init__: Control channel established.
19/12/02 23:07:42 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 23:07:42 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37805.
19/12/02 23:07:42 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 23:07:42 INFO data_plane.create_data_channel: Creating client data channel for localhost:44857
19/12/02 23:07:42 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 23:07:42 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/02 23:07:42 INFO sdk_worker.run: No more requests from control plane
19/12/02 23:07:42 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 23:07:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 23:07:42 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 23:07:42 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 23:07:42 INFO sdk_worker.run: Done consuming work.
19/12/02 23:07:42 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 23:07:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 23:07:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 23:07:42 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575328056.3_afcf8989-ddf5-44c2-9134-a79fd6e0b154 finished.
19/12/02 23:07:42 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/02 23:07:42 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_4244f732-b366-4b56-bb49-1a8d2662e9b2","basePath":"/tmp/sparktestBf4Pjn"}: {}
java.io.FileNotFoundException: /tmp/sparktestBf4Pjn/job_4244f732-b366-4b56-bb49-1a8d2662e9b2/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140069438170880)>
# Thread: <Thread(Thread-118, started daemon 140069352503040)>
# Thread: <_MainThread(MainThread, started 140070217279232)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140069344110336)>
# Thread: <Thread(Thread-124, started daemon 140069335717632)>
# Thread: <_MainThread(MainThread, started 140070217279232)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575328045.52_bcf643e3-9a9c-4eaa-a471-f4b6e7dc7b11 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 305.006s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 17s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/24s5jlkvigxvk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1668

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1668/display/redirect?page=changes>

Changes:

[sunjincheng121] [BEAM-8733]  Handle the registration request synchronously in the Python


------------------------------------------
[...truncated 1.32 MB...]
19/12/02 19:52:06 INFO sdk_worker_main.start: Status HTTP server running at localhost:44005
19/12/02 19:52:06 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 19:52:06 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 19:52:06 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575316323.07_6046d00c-fff4-4db8-b431-caf957a07f07', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 19:52:06 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575316323.07', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46905', 'job_port': u'0'}
19/12/02 19:52:06 INFO statecache.__init__: Creating state cache with size 0
19/12/02 19:52:06 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40443.
19/12/02 19:52:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/02 19:52:06 INFO sdk_worker.__init__: Control channel established.
19/12/02 19:52:06 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 19:52:06 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39951.
19/12/02 19:52:06 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 19:52:06 INFO data_plane.create_data_channel: Creating client data channel for localhost:42195
19/12/02 19:52:06 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 19:52:06 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/02 19:52:06 INFO sdk_worker.run: No more requests from control plane
19/12/02 19:52:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 19:52:06 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 19:52:06 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 19:52:06 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 19:52:06 INFO sdk_worker.run: Done consuming work.
19/12/02 19:52:06 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 19:52:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 19:52:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 19:52:06 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 19:52:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 19:52:07 INFO sdk_worker_main.main: Logging handler created.
19/12/02 19:52:07 INFO sdk_worker_main.start: Status HTTP server running at localhost:43457
19/12/02 19:52:07 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 19:52:07 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 19:52:07 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575316323.07_6046d00c-fff4-4db8-b431-caf957a07f07', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 19:52:07 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575316323.07', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46905', 'job_port': u'0'}
19/12/02 19:52:07 INFO statecache.__init__: Creating state cache with size 0
19/12/02 19:52:07 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41305.
19/12/02 19:52:07 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/02 19:52:07 INFO sdk_worker.__init__: Control channel established.
19/12/02 19:52:07 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 19:52:07 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41275.
19/12/02 19:52:07 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 19:52:07 INFO data_plane.create_data_channel: Creating client data channel for localhost:33861
19/12/02 19:52:07 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 19:52:07 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/02 19:52:07 INFO sdk_worker.run: No more requests from control plane
19/12/02 19:52:07 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 19:52:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 19:52:07 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 19:52:07 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 19:52:07 INFO sdk_worker.run: Done consuming work.
19/12/02 19:52:07 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 19:52:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 19:52:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 19:52:07 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 19:52:08 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 19:52:08 INFO sdk_worker_main.main: Logging handler created.
19/12/02 19:52:08 INFO sdk_worker_main.start: Status HTTP server running at localhost:37131
19/12/02 19:52:08 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 19:52:08 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 19:52:08 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575316323.07_6046d00c-fff4-4db8-b431-caf957a07f07', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 19:52:08 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575316323.07', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46905', 'job_port': u'0'}
19/12/02 19:52:08 INFO statecache.__init__: Creating state cache with size 0
19/12/02 19:52:08 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46441.
19/12/02 19:52:08 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/02 19:52:08 INFO sdk_worker.__init__: Control channel established.
19/12/02 19:52:08 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 19:52:08 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36415.
19/12/02 19:52:08 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 19:52:08 INFO data_plane.create_data_channel: Creating client data channel for localhost:39377
19/12/02 19:52:08 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 19:52:08 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/02 19:52:08 INFO sdk_worker.run: No more requests from control plane
19/12/02 19:52:08 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 19:52:08 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 19:52:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 19:52:08 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 19:52:08 INFO sdk_worker.run: Done consuming work.
19/12/02 19:52:08 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 19:52:08 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 19:52:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 19:52:08 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 19:52:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 19:52:09 INFO sdk_worker_main.main: Logging handler created.
19/12/02 19:52:09 INFO sdk_worker_main.start: Status HTTP server running at localhost:44675
19/12/02 19:52:09 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 19:52:09 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 19:52:09 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575316323.07_6046d00c-fff4-4db8-b431-caf957a07f07', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 19:52:09 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575316323.07', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46905', 'job_port': u'0'}
19/12/02 19:52:09 INFO statecache.__init__: Creating state cache with size 0
19/12/02 19:52:09 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43365.
19/12/02 19:52:09 INFO sdk_worker.__init__: Control channel established.
19/12/02 19:52:09 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 19:52:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/02 19:52:09 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:32889.
19/12/02 19:52:09 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 19:52:09 INFO data_plane.create_data_channel: Creating client data channel for localhost:45293
19/12/02 19:52:09 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 19:52:09 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/02 19:52:09 INFO sdk_worker.run: No more requests from control plane
19/12/02 19:52:09 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 19:52:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 19:52:09 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 19:52:09 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 19:52:09 INFO sdk_worker.run: Done consuming work.
19/12/02 19:52:09 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 19:52:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 19:52:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 19:52:09 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575316323.07_6046d00c-fff4-4db8-b431-caf957a07f07 finished.
19/12/02 19:52:09 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/02 19:52:09 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_18efc3f0-0527-47af-bcfd-e4553086af64","basePath":"/tmp/sparktestzSz0RX"}: {}
java.io.FileNotFoundException: /tmp/sparktestzSz0RX/job_18efc3f0-0527-47af-bcfd-e4553086af64/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
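
This recurring cleanup ERROR appears to be cleanup-only rather than a test failure: the earlier "GetManifest for __no_artifacts_staged__" lines show the job never wrote a MANIFEST under its staging token, so removeArtifacts fails as soon as loadManifest tries to open the file. A minimal Python sketch of the failure mode, assuming cleanup unconditionally opens the manifest; the path below is hypothetical, not the runner's actual layout:

    import errno
    import os

    # Hypothetical staging layout; the real basePath comes from the staging token.
    manifest = os.path.join('/tmp/sparktest_example', 'job_x', 'MANIFEST')

    try:
        open(manifest).close()        # what loadManifest effectively does
    except IOError as e:              # surfaces as FileNotFoundException on the JVM
        if e.errno != errno.ENOENT:   # anything but "no such file" is a real error
            raise                     # a missing manifest just means nothing was staged
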
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
==================== Timed out after 60 seconds. ====================

  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

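The "Timed out after 60 seconds" banner and the "# Thread: ..." lines scattered through these tracebacks come from a watchdog in portable_runner_test.py: its handler (the frame at line 75 above) prints the banner, dumps every live thread, and then raises BaseException, and that output interleaves with the traceback unittest prints afterwards. A minimal sketch of such a watchdog, assuming a SIGALRM-based shape; everything except the raise seen at portable_runner_test.py:75 is illustrative:

    import signal
    import threading

    TIMEOUT_SECS = 60  # matches the 60-second banner in the log

    def handler(signum, frame):
        # Print the banner and one "# Thread: ..." line per live thread, then
        # abort the hung test; unittest prints its traceback afterwards, which
        # is why the two streams interleave in the raw log.
        msg = 'Timed out after %d seconds.' % TIMEOUT_SECS
        print('=' * 20 + ' ' + msg + ' ' + '=' * 20)
        for t in threading.enumerate():
            print('# Thread: %s' % t)
        raise BaseException(msg)  # the frame seen at portable_runner_test.py:75

    signal.signal(signal.SIGALRM, handler)  # SIGALRM is POSIX-only
    signal.alarm(TIMEOUT_SECS)
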
======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_ru# Thread: <Thread(wait_until_finish_read, started daemon 140353306609408)>

nner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
# Thread: <Thread(Thread-116, started daemon 140353323394816)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
# Thread: <_MainThread(MainThread, started 140354102503168)>
==================== Timed out after 60 seconds. ====================

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
# Thread: <Thread(wait_until_finish_read, started daemon 140352811951872)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
# Thread: <Thread(Thread-122, started daemon 140352820344576)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
# Thread: <_MainThread(MainThread, started 140354102503168)>

  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
# Thread: <Thread(wait_until_finish_read, started daemon 140353306609408)>

BaseException: Timed out after 60 seconds.

# Thread: <Thread(Thread-116, started daemon 140353323394816)>
======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575316310.32_8df681ce-4d54-4250-90ec-a1e9e39e5651 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

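The expected value in the assertion above, list(''.join(data)), flattens whatever strings the test generates into one list of single characters, which the pipeline under test is expected to emit one element at a time. A worked example with hypothetical data (the real value of data is not shown in this log):

    # Hypothetical input; the log elides the test's real `data`.
    data = ['ab', 'c']
    expected = list(''.join(data))    # -> ['a', 'b', 'c']
    assert expected == ['a', 'b', 'c']
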
----------------------------------------------------------------------
Ran 38 tests in 351.412s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 10m 9s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/iwdvlnwg6euhk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1667

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1667/display/redirect>

Changes:


------------------------------------------
[...truncated 1.31 MB...]
19/12/02 18:14:05 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1575310444.26_4dc2b39c-59c1-44dd-8e8c-c812a503ff41 on Spark master local
19/12/02 18:14:05 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/12/02 18:14:05 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575310444.26_4dc2b39c-59c1-44dd-8e8c-c812a503ff41: Pipeline translated successfully. Computing outputs
19/12/02 18:14:05 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 18:14:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 18:14:05 INFO sdk_worker_main.main: Logging handler created.
19/12/02 18:14:05 INFO sdk_worker_main.start: Status HTTP server running at localhost:43165
19/12/02 18:14:05 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 18:14:05 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 18:14:05 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575310444.26_4dc2b39c-59c1-44dd-8e8c-c812a503ff41', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 18:14:05 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575310444.26', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53973', 'job_port': u'0'}
19/12/02 18:14:05 INFO statecache.__init__: Creating state cache with size 0
19/12/02 18:14:05 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35021.
19/12/02 18:14:05 INFO sdk_worker.__init__: Control channel established.
19/12/02 18:14:05 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 18:14:05 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/02 18:14:05 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42059.
19/12/02 18:14:05 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 18:14:05 INFO data_plane.create_data_channel: Creating client data channel for localhost:46473
19/12/02 18:14:05 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 18:14:05 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/02 18:14:06 INFO sdk_worker.run: No more requests from control plane
19/12/02 18:14:06 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 18:14:06 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 18:14:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 18:14:06 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 18:14:06 INFO sdk_worker.run: Done consuming work.
19/12/02 18:14:06 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 18:14:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 18:14:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 18:14:06 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 18:14:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 18:14:06 INFO sdk_worker_main.main: Logging handler created.
19/12/02 18:14:06 INFO sdk_worker_main.start: Status HTTP server running at localhost:41317
19/12/02 18:14:06 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 18:14:06 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 18:14:06 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575310444.26_4dc2b39c-59c1-44dd-8e8c-c812a503ff41', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 18:14:06 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575310444.26', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53973', 'job_port': u'0'}
19/12/02 18:14:06 INFO statecache.__init__: Creating state cache with size 0
19/12/02 18:14:06 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38915.
19/12/02 18:14:06 INFO sdk_worker.__init__: Control channel established.
19/12/02 18:14:06 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 18:14:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/02 18:14:06 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38099.
19/12/02 18:14:06 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 18:14:06 INFO data_plane.create_data_channel: Creating client data channel for localhost:45031
19/12/02 18:14:06 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 18:14:06 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/02 18:14:06 INFO sdk_worker.run: No more requests from control plane
19/12/02 18:14:06 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 18:14:06 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 18:14:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 18:14:06 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 18:14:06 INFO sdk_worker.run: Done consuming work.
19/12/02 18:14:06 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 18:14:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 18:14:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 18:14:07 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 18:14:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 18:14:07 INFO sdk_worker_main.main: Logging handler created.
19/12/02 18:14:07 INFO sdk_worker_main.start: Status HTTP server running at localhost:44829
19/12/02 18:14:07 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 18:14:07 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 18:14:07 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575310444.26_4dc2b39c-59c1-44dd-8e8c-c812a503ff41', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 18:14:07 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575310444.26', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53973', 'job_port': u'0'}
19/12/02 18:14:07 INFO statecache.__init__: Creating state cache with size 0
19/12/02 18:14:07 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36647.
19/12/02 18:14:07 INFO sdk_worker.__init__: Control channel established.
19/12/02 18:14:07 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 18:14:07 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/02 18:14:07 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43441.
19/12/02 18:14:07 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 18:14:07 INFO data_plane.create_data_channel: Creating client data channel for localhost:43043
19/12/02 18:14:07 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 18:14:07 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/02 18:14:07 INFO sdk_worker.run: No more requests from control plane
19/12/02 18:14:07 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 18:14:07 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 18:14:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 18:14:07 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 18:14:07 INFO sdk_worker.run: Done consuming work.
19/12/02 18:14:07 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 18:14:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 18:14:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 18:14:07 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 18:14:08 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 18:14:08 INFO sdk_worker_main.main: Logging handler created.
19/12/02 18:14:08 INFO sdk_worker_main.start: Status HTTP server running at localhost:35317
19/12/02 18:14:08 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 18:14:08 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 18:14:08 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575310444.26_4dc2b39c-59c1-44dd-8e8c-c812a503ff41', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 18:14:08 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575310444.26', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53973', 'job_port': u'0'}
19/12/02 18:14:08 INFO statecache.__init__: Creating state cache with size 0
19/12/02 18:14:08 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42263.
19/12/02 18:14:08 INFO sdk_worker.__init__: Control channel established.
19/12/02 18:14:08 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/02 18:14:08 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 18:14:08 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40303.
19/12/02 18:14:08 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 18:14:08 INFO data_plane.create_data_channel: Creating client data channel for localhost:43231
19/12/02 18:14:08 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 18:14:08 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/02 18:14:08 INFO sdk_worker.run: No more requests from control plane
19/12/02 18:14:08 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 18:14:08 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 18:14:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 18:14:08 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 18:14:08 INFO sdk_worker.run: Done consuming work.
19/12/02 18:14:08 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 18:14:08 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 18:14:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 18:14:08 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 18:14:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 18:14:09 INFO sdk_worker_main.main: Logging handler created.
19/12/02 18:14:09 INFO sdk_worker_main.start: Status HTTP server running at localhost:36029
19/12/02 18:14:09 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 18:14:09 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 18:14:09 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575310444.26_4dc2b39c-59c1-44dd-8e8c-c812a503ff41', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 18:14:09 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575310444.26', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53973', 'job_port': u'0'}
19/12/02 18:14:09 INFO statecache.__init__: Creating state cache with size 0
19/12/02 18:14:09 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41519.
19/12/02 18:14:09 INFO sdk_worker.__init__: Control channel established.
19/12/02 18:14:09 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 18:14:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/02 18:14:09 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38267.
19/12/02 18:14:09 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 18:14:09 INFO data_plane.create_data_channel: Creating client data channel for localhost:46233
19/12/02 18:14:09 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 18:14:09 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/02 18:14:09 INFO sdk_worker.run: No more requests from control plane
19/12/02 18:14:09 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 18:14:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 18:14:09 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 18:14:09 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 18:14:09 INFO sdk_worker.run: Done consuming work.
19/12/02 18:14:09 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 18:14:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 18:14:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 18:14:09 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575310444.26_4dc2b39c-59c1-44dd-8e8c-c812a503ff41 finished.
19/12/02 18:14:09 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/02 18:14:09 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_252fe288-178c-490a-beb1-a694f350c984","basePath":"/tmp/sparktest6W8AMZ"}: {}
java.io.FileNotFoundException: /tmp/sparktest6W8AMZ/job_252fe288-178c-490a-beb1-a694f350c984/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140077219415808)>
# Thread: <Thread(Thread-119, started daemon 140076734478080)>
# Thread: <_MainThread(MainThread, started 140078014588672)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575310435.37_3fb6ed47-5828-4d8b-beee-03b3d2f9939b failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 287.483s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 16s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/jfecorx56tfr6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1666

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1666/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/12/02 12:19:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 12:19:06 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 12:19:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 12:19:07 INFO sdk_worker_main.main: Logging handler created.
19/12/02 12:19:07 INFO sdk_worker_main.start: Status HTTP server running at localhost:39717
19/12/02 12:19:07 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 12:19:07 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 12:19:07 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575289144.78_5e824f44-7a7a-4970-bc99-2d727a3a3c55', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 12:19:07 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575289144.78', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46617', 'job_port': u'0'}
19/12/02 12:19:07 INFO statecache.__init__: Creating state cache with size 0
19/12/02 12:19:07 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39003.
19/12/02 12:19:07 INFO sdk_worker.__init__: Control channel established.
19/12/02 12:19:07 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/02 12:19:07 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 12:19:07 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35451.
19/12/02 12:19:07 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 12:19:07 INFO data_plane.create_data_channel: Creating client data channel for localhost:41007
19/12/02 12:19:07 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 12:19:07 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/02 12:19:07 INFO sdk_worker.run: No more requests from control plane
19/12/02 12:19:07 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 12:19:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 12:19:07 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 12:19:07 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 12:19:07 INFO sdk_worker.run: Done consuming work.
19/12/02 12:19:07 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 12:19:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 12:19:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 12:19:07 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 12:19:08 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 12:19:08 INFO sdk_worker_main.main: Logging handler created.
19/12/02 12:19:08 INFO sdk_worker_main.start: Status HTTP server running at localhost:34331
19/12/02 12:19:08 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 12:19:08 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 12:19:08 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575289144.78_5e824f44-7a7a-4970-bc99-2d727a3a3c55', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 12:19:08 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575289144.78', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46617', 'job_port': u'0'}
19/12/02 12:19:08 INFO statecache.__init__: Creating state cache with size 0
19/12/02 12:19:08 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46437.
19/12/02 12:19:08 INFO sdk_worker.__init__: Control channel established.
19/12/02 12:19:08 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 12:19:08 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/02 12:19:08 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34887.
19/12/02 12:19:08 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 12:19:08 INFO data_plane.create_data_channel: Creating client data channel for localhost:34529
19/12/02 12:19:08 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 12:19:08 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/02 12:19:08 INFO sdk_worker.run: No more requests from control plane
19/12/02 12:19:08 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 12:19:08 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 12:19:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 12:19:08 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 12:19:08 INFO sdk_worker.run: Done consuming work.
19/12/02 12:19:08 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 12:19:08 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 12:19:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 12:19:08 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 12:19:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 12:19:09 INFO sdk_worker_main.main: Logging handler created.
19/12/02 12:19:09 INFO sdk_worker_main.start: Status HTTP server running at localhost:35705
19/12/02 12:19:09 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 12:19:09 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 12:19:09 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575289144.78_5e824f44-7a7a-4970-bc99-2d727a3a3c55', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 12:19:09 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575289144.78', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46617', 'job_port': u'0'}
19/12/02 12:19:09 INFO statecache.__init__: Creating state cache with size 0
19/12/02 12:19:09 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43033.
19/12/02 12:19:09 INFO sdk_worker.__init__: Control channel established.
19/12/02 12:19:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/02 12:19:09 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 12:19:09 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35445.
19/12/02 12:19:09 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 12:19:09 INFO data_plane.create_data_channel: Creating client data channel for localhost:34099
19/12/02 12:19:09 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 12:19:09 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/02 12:19:09 INFO sdk_worker.run: No more requests from control plane
19/12/02 12:19:09 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 12:19:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 12:19:09 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 12:19:09 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 12:19:09 INFO sdk_worker.run: Done consuming work.
19/12/02 12:19:09 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 12:19:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 12:19:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 12:19:09 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 12:19:10 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 12:19:10 INFO sdk_worker_main.main: Logging handler created.
19/12/02 12:19:10 INFO sdk_worker_main.start: Status HTTP server running at localhost:36901
19/12/02 12:19:10 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 12:19:10 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 12:19:10 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575289144.78_5e824f44-7a7a-4970-bc99-2d727a3a3c55', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 12:19:10 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575289144.78', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46617', 'job_port': u'0'}
19/12/02 12:19:10 INFO statecache.__init__: Creating state cache with size 0
19/12/02 12:19:10 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42677.
19/12/02 12:19:10 INFO sdk_worker.__init__: Control channel established.
19/12/02 12:19:10 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 12:19:10 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/02 12:19:10 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36879.
19/12/02 12:19:10 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 12:19:10 INFO data_plane.create_data_channel: Creating client data channel for localhost:46811
19/12/02 12:19:10 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 12:19:10 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/02 12:19:10 INFO sdk_worker.run: No more requests from control plane
19/12/02 12:19:10 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 12:19:10 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 12:19:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 12:19:10 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 12:19:10 INFO sdk_worker.run: Done consuming work.
19/12/02 12:19:10 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 12:19:10 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 12:19:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 12:19:10 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575289144.78_5e824f44-7a7a-4970-bc99-2d727a3a3c55 finished.
19/12/02 12:19:10 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/02 12:19:10 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_f8f93c73-88df-4615-9bad-5b744ab55167","basePath":"/tmp/sparktestmQZUfn"}: {}
java.io.FileNotFoundException: /tmp/sparktestmQZUfn/job_f8f93c73-88df-4615-9bad-5b744ab55167/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
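
[The FileNotFoundException above is a cleanup-time artifact, not a test failure: nothing was staged for this job, yet removeArtifacts() opens MANIFEST unconditionally. A tolerant sketch of the idea in Python (illustrative only; the real service is the Java code in the stack trace):

    import os

    def remove_staged_artifacts(staging_dir):
        # Tolerant variant: skip cleanup when no MANIFEST was ever
        # written (the case logged above), instead of failing on open().
        manifest = os.path.join(staging_dir, 'MANIFEST')
        if not os.path.exists(manifest):
            return
        os.remove(manifest)
]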
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
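
[The BaseException here comes from the suite's own watchdog rather than from gRPC: per the traceback, portable_runner_test.py installs a handler that raises after 60 seconds so a hung wait_until_finish() cannot stall the build. A minimal sketch of that pattern; run_test is a placeholder for the pipeline under test:

    import signal

    TIMEOUT_SECS = 60

    def handler(signum, frame):
        # BaseException (not Exception) so broad except clauses in the
        # code under test cannot swallow the timeout.
        raise BaseException('Timed out after %d seconds.' % TIMEOUT_SECS)

    def run_test():
        pass  # placeholder for the pipeline under test

    signal.signal(signal.SIGALRM, handler)
    signal.alarm(TIMEOUT_SECS)
    try:
        run_test()
    finally:
        signal.alarm(0)  # cancel the pending alarm
]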

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140444742702848)>
# Thread: <Thread(Thread-120, started daemon 140444759488256)>
# Thread: <_MainThread(MainThread, started 140445881960192)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140444725917440)>
# Thread: <Thread(Thread-124, started daemon 140444734310144)>
# Thread: <_MainThread(MainThread, started 140445881960192)>
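
[For context, test_pardo_timers drives an event-time timer through the portable runner; the shape of such a DoFn is roughly as follows (class and method names are illustrative, the userstate/timer APIs are Beam's):

    import apache_beam as beam
    from apache_beam.transforms import userstate
    from apache_beam.transforms.timeutil import TimeDomain

    class TimerDoFn(beam.DoFn):
        # An event-time timer: fires when the watermark passes the
        # timestamp it was set to. Timers require keyed input.
        EMIT_TIMER = userstate.TimerSpec('emit', TimeDomain.WATERMARK)

        def process(self, element, timer=beam.DoFn.TimerParam(EMIT_TIMER)):
            key, ts = element
            timer.set(ts)

        @userstate.on_timer(EMIT_TIMER)
        def emit_callback(self):
            yield 'fired'

    # usage: p | beam.Create([('k', 10)]) | beam.ParDo(TimerDoFn())

The timeout above means the runner never delivered the timer firing (or the job state stream stalled), not that the assertion itself failed.]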

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575289135.1_d1b63aed-344e-494c-8d90-17e859a01314 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
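
[test_sdf_with_watermark_tracking exercises a splittable DoFn, and the UnsupportedOperationException pinpoints the gap: when the SDK hands back a checkpoint (an unfinished residual restriction), the Spark portable runner has no bundle checkpoint handler registered to resume it. A minimal splittable DoFn of the kind involved (class names are illustrative; the restriction APIs are Beam's):

    import apache_beam as beam
    from apache_beam.io.restriction_trackers import (
        OffsetRange, OffsetRestrictionTracker)
    from apache_beam.transforms.core import RestrictionProvider

    class CountProvider(RestrictionProvider):
        def initial_restriction(self, element):
            return OffsetRange(0, element)

        def create_tracker(self, restriction):
            return OffsetRestrictionTracker(restriction)

        def restriction_size(self, element, restriction):
            return restriction.size()

    class CountFn(beam.DoFn):
        def process(self, element,
                    tracker=beam.DoFn.RestrictionParam(CountProvider())):
            # Claim and emit one position at a time; anything unclaimed
            # when the runner checkpoints becomes the residual.
            pos = tracker.current_restriction().start
            while tracker.try_claim(pos):
                yield pos
                pos += 1

If the runner splits or checkpoints the bundle mid-restriction, the unclaimed remainder must be handed back for later processing, which is exactly the handler the Spark runner lacked here.]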

----------------------------------------------------------------------
Ran 38 tests in 306.917s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
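
[To reproduce locally with more detail, the failing task can be rerun from the Beam source root along the lines Gradle suggests, with the task path taken from the log above:

    ./gradlew :sdks:python:test-suites:portable:py2:sparkValidatesRunner --info --stacktrace
]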

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 39s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/anjtpjlrjvdzy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1665

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1665/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/12/02 06:17:24 INFO sdk_worker_main.start: Status HTTP server running at localhost:41719
19/12/02 06:17:24 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 06:17:24 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 06:17:24 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575267442.11_a5cf41f0-f4da-4292-b4f4-f25c8fe90247', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 06:17:24 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575267442.11', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58661', 'job_port': u'0'}
19/12/02 06:17:24 INFO statecache.__init__: Creating state cache with size 0
19/12/02 06:17:24 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38767.
19/12/02 06:17:24 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/02 06:17:24 INFO sdk_worker.__init__: Control channel established.
19/12/02 06:17:24 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 06:17:24 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42597.
19/12/02 06:17:24 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 06:17:24 INFO data_plane.create_data_channel: Creating client data channel for localhost:34629
19/12/02 06:17:24 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 06:17:24 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/02 06:17:24 INFO sdk_worker.run: No more requests from control plane
19/12/02 06:17:24 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 06:17:24 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 06:17:24 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 06:17:24 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 06:17:24 INFO sdk_worker.run: Done consuming work.
19/12/02 06:17:24 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 06:17:24 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 06:17:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 06:17:25 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 06:17:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 06:17:25 INFO sdk_worker_main.main: Logging handler created.
19/12/02 06:17:25 INFO sdk_worker_main.start: Status HTTP server running at localhost:43851
19/12/02 06:17:25 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 06:17:25 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 06:17:25 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575267442.11_a5cf41f0-f4da-4292-b4f4-f25c8fe90247', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 06:17:25 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575267442.11', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58661', 'job_port': u'0'}
19/12/02 06:17:25 INFO statecache.__init__: Creating state cache with size 0
19/12/02 06:17:25 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42927.
19/12/02 06:17:25 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/02 06:17:25 INFO sdk_worker.__init__: Control channel established.
19/12/02 06:17:25 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 06:17:25 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41097.
19/12/02 06:17:25 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 06:17:25 INFO data_plane.create_data_channel: Creating client data channel for localhost:43859
19/12/02 06:17:25 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 06:17:25 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/02 06:17:25 INFO sdk_worker.run: No more requests from control plane
19/12/02 06:17:25 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 06:17:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 06:17:25 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 06:17:25 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 06:17:25 INFO sdk_worker.run: Done consuming work.
19/12/02 06:17:25 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 06:17:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 06:17:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 06:17:26 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 06:17:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 06:17:26 INFO sdk_worker_main.main: Logging handler created.
19/12/02 06:17:26 INFO sdk_worker_main.start: Status HTTP server running at localhost:36345
19/12/02 06:17:26 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 06:17:26 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 06:17:26 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575267442.11_a5cf41f0-f4da-4292-b4f4-f25c8fe90247', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 06:17:26 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575267442.11', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58661', 'job_port': u'0'}
19/12/02 06:17:26 INFO statecache.__init__: Creating state cache with size 0
19/12/02 06:17:26 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35421.
19/12/02 06:17:26 INFO sdk_worker.__init__: Control channel established.
19/12/02 06:17:26 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/02 06:17:26 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 06:17:26 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40135.
19/12/02 06:17:26 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 06:17:26 INFO data_plane.create_data_channel: Creating client data channel for localhost:46839
19/12/02 06:17:26 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 06:17:26 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/02 06:17:26 INFO sdk_worker.run: No more requests from control plane
19/12/02 06:17:26 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 06:17:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 06:17:26 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 06:17:26 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 06:17:26 INFO sdk_worker.run: Done consuming work.
19/12/02 06:17:26 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 06:17:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 06:17:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 06:17:27 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 06:17:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 06:17:27 INFO sdk_worker_main.main: Logging handler created.
19/12/02 06:17:27 INFO sdk_worker_main.start: Status HTTP server running at localhost:40109
19/12/02 06:17:27 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 06:17:27 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 06:17:27 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575267442.11_a5cf41f0-f4da-4292-b4f4-f25c8fe90247', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 06:17:27 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575267442.11', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58661', 'job_port': u'0'}
19/12/02 06:17:27 INFO statecache.__init__: Creating state cache with size 0
19/12/02 06:17:27 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34953.
19/12/02 06:17:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/02 06:17:27 INFO sdk_worker.__init__: Control channel established.
19/12/02 06:17:27 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 06:17:27 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42439.
19/12/02 06:17:27 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 06:17:27 INFO data_plane.create_data_channel: Creating client data channel for localhost:38581
19/12/02 06:17:27 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 06:17:27 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/02 06:17:27 INFO sdk_worker.run: No more requests from control plane
19/12/02 06:17:27 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 06:17:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 06:17:27 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 06:17:27 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 06:17:27 INFO sdk_worker.run: Done consuming work.
19/12/02 06:17:27 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 06:17:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 06:17:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 06:17:27 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575267442.11_a5cf41f0-f4da-4292-b4f4-f25c8fe90247 finished.
19/12/02 06:17:27 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/02 06:17:27 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_6e65d1a0-ce03-4b48-a0c3-305442a24622","basePath":"/tmp/sparktest3OMUrJ"}: {}
java.io.FileNotFoundException: /tmp/sparktest3OMUrJ/job_6e65d1a0-ce03-4b48-a0c3-305442a24622/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140651100137216)>
# Thread: <Thread(Thread-120, started daemon 140651108529920)>
# Thread: <_MainThread(MainThread, started 140651887638272)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140651073910528)>
# Thread: <Thread(Thread-126, started daemon 140651082565376)>
# Thread: <_MainThread(MainThread, started 140651887638272)>
# Thread: <Thread(Thread-120, started daemon 140651108529920)>
# Thread: <Thread(wait_until_finish_read, started daemon 140651100137216)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575267431.96_c05e4f36-efb4-49d7-ac00-e4c0dbcc17e4 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 319.185s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 16s
60 actionable tasks: 48 executed, 12 from cache

Publishing build scan...
https://scans.gradle.com/s/z33tgmiq5duhc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1664

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1664/display/redirect>

Changes:


------------------------------------------
[...truncated 1.31 MB...]
19/12/02 00:13:10 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1575245589.18_a19de7f5-6f55-4ac5-a608-4c5045401934 on Spark master local
19/12/02 00:13:10 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/12/02 00:13:10 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575245589.18_a19de7f5-6f55-4ac5-a608-4c5045401934: Pipeline translated successfully. Computing outputs
19/12/02 00:13:10 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 00:13:10 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 00:13:10 INFO sdk_worker_main.main: Logging handler created.
19/12/02 00:13:10 INFO sdk_worker_main.start: Status HTTP server running at localhost:45553
19/12/02 00:13:10 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 00:13:10 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 00:13:10 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575245589.18_a19de7f5-6f55-4ac5-a608-4c5045401934', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 00:13:10 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575245589.18', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:39877', 'job_port': u'0'}
19/12/02 00:13:10 INFO statecache.__init__: Creating state cache with size 0
19/12/02 00:13:10 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34887.
19/12/02 00:13:10 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/02 00:13:10 INFO sdk_worker.__init__: Control channel established.
19/12/02 00:13:10 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 00:13:10 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43647.
19/12/02 00:13:10 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 00:13:10 INFO data_plane.create_data_channel: Creating client data channel for localhost:43265
19/12/02 00:13:10 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 00:13:10 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/02 00:13:10 INFO sdk_worker.run: No more requests from control plane
19/12/02 00:13:10 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 00:13:10 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 00:13:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 00:13:10 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 00:13:10 INFO sdk_worker.run: Done consuming work.
19/12/02 00:13:10 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 00:13:10 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 00:13:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 00:13:11 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 00:13:11 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 00:13:11 INFO sdk_worker_main.main: Logging handler created.
19/12/02 00:13:11 INFO sdk_worker_main.start: Status HTTP server running at localhost:36639
19/12/02 00:13:11 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 00:13:11 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 00:13:11 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575245589.18_a19de7f5-6f55-4ac5-a608-4c5045401934', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 00:13:11 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575245589.18', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:39877', 'job_port': u'0'}
19/12/02 00:13:11 INFO statecache.__init__: Creating state cache with size 0
19/12/02 00:13:11 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39109.
19/12/02 00:13:11 INFO sdk_worker.__init__: Control channel established.
19/12/02 00:13:11 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 00:13:11 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/02 00:13:11 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39153.
19/12/02 00:13:11 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 00:13:11 INFO data_plane.create_data_channel: Creating client data channel for localhost:35023
19/12/02 00:13:11 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 00:13:11 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/02 00:13:11 INFO sdk_worker.run: No more requests from control plane
19/12/02 00:13:11 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 00:13:11 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 00:13:11 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 00:13:11 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 00:13:11 INFO sdk_worker.run: Done consuming work.
19/12/02 00:13:11 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 00:13:11 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 00:13:11 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 00:13:11 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 00:13:12 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 00:13:12 INFO sdk_worker_main.main: Logging handler created.
19/12/02 00:13:12 INFO sdk_worker_main.start: Status HTTP server running at localhost:43743
19/12/02 00:13:12 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 00:13:12 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 00:13:12 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575245589.18_a19de7f5-6f55-4ac5-a608-4c5045401934', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 00:13:12 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575245589.18', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:39877', 'job_port': u'0'}
19/12/02 00:13:12 INFO statecache.__init__: Creating state cache with size 0
19/12/02 00:13:12 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39141.
19/12/02 00:13:12 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/02 00:13:12 INFO sdk_worker.__init__: Control channel established.
19/12/02 00:13:12 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 00:13:12 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40089.
19/12/02 00:13:12 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 00:13:12 INFO data_plane.create_data_channel: Creating client data channel for localhost:44481
19/12/02 00:13:12 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 00:13:12 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/02 00:13:12 INFO sdk_worker.run: No more requests from control plane
19/12/02 00:13:12 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 00:13:12 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 00:13:12 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 00:13:12 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 00:13:12 INFO sdk_worker.run: Done consuming work.
19/12/02 00:13:12 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 00:13:12 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 00:13:12 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 00:13:12 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 00:13:13 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 00:13:13 INFO sdk_worker_main.main: Logging handler created.
19/12/02 00:13:13 INFO sdk_worker_main.start: Status HTTP server running at localhost:41215
19/12/02 00:13:13 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 00:13:13 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 00:13:13 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575245589.18_a19de7f5-6f55-4ac5-a608-4c5045401934', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 00:13:13 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575245589.18', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:39877', 'job_port': u'0'}
19/12/02 00:13:13 INFO statecache.__init__: Creating state cache with size 0
19/12/02 00:13:13 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41823.
19/12/02 00:13:13 INFO sdk_worker.__init__: Control channel established.
19/12/02 00:13:13 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 00:13:13 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/02 00:13:13 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38499.
19/12/02 00:13:13 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 00:13:13 INFO data_plane.create_data_channel: Creating client data channel for localhost:45021
19/12/02 00:13:13 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 00:13:13 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/02 00:13:13 INFO sdk_worker.run: No more requests from control plane
19/12/02 00:13:13 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 00:13:13 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 00:13:13 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 00:13:13 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 00:13:13 INFO sdk_worker.run: Done consuming work.
19/12/02 00:13:13 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 00:13:13 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 00:13:13 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 00:13:13 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 00:13:14 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 00:13:14 INFO sdk_worker_main.main: Logging handler created.
19/12/02 00:13:14 INFO sdk_worker_main.start: Status HTTP server running at localhost:42905
19/12/02 00:13:14 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 00:13:14 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 00:13:14 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575245589.18_a19de7f5-6f55-4ac5-a608-4c5045401934', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 00:13:14 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575245589.18', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:39877', 'job_port': u'0'}
19/12/02 00:13:14 INFO statecache.__init__: Creating state cache with size 0
19/12/02 00:13:14 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38583.
19/12/02 00:13:14 INFO sdk_worker.__init__: Control channel established.
19/12/02 00:13:14 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/02 00:13:14 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 00:13:14 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43555.
19/12/02 00:13:14 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 00:13:14 INFO data_plane.create_data_channel: Creating client data channel for localhost:38667
19/12/02 00:13:14 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 00:13:14 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/02 00:13:14 INFO sdk_worker.run: No more requests from control plane
19/12/02 00:13:14 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 00:13:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 00:13:14 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 00:13:14 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 00:13:14 INFO sdk_worker.run: Done consuming work.
19/12/02 00:13:14 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 00:13:14 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 00:13:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 00:13:14 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575245589.18_a19de7f5-6f55-4ac5-a608-4c5045401934 finished.
19/12/02 00:13:14 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/02 00:13:14 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_523424ac-49a3-409c-89d4-dc39c8986a5c","basePath":"/tmp/sparktestmn1d0_"}: {}
java.io.FileNotFoundException: /tmp/sparktestmn1d0_/job_523424ac-49a3-409c-89d4-dc39c8986a5c/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575245580.14_7d10de80-86a8-4622-b70e-338ea53bb9cd failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140645320652544)>
# Thread: <Thread(Thread-118, started daemon 140645337437952)>
# Thread: <_MainThread(MainThread, started 140646116546304)>

----------------------------------------------------------------------
Ran 38 tests in 293.346s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 41s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/psiroieuwxsuy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1663

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1663/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/12/01 18:14:09 INFO sdk_worker_main.start: Status HTTP server running at localhost:46665
19/12/01 18:14:09 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/01 18:14:09 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/01 18:14:09 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575224047.07_d03acb74-235e-422c-924c-deda0bd67ed6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/01 18:14:09 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575224047.07', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52543', 'job_port': u'0'}
19/12/01 18:14:09 INFO statecache.__init__: Creating state cache with size 0
19/12/01 18:14:09 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36243.
19/12/01 18:14:09 INFO sdk_worker.__init__: Control channel established.
19/12/01 18:14:09 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/01 18:14:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/01 18:14:09 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44173.
19/12/01 18:14:09 INFO sdk_worker.create_state_handler: State channel established.
19/12/01 18:14:09 INFO data_plane.create_data_channel: Creating client data channel for localhost:44623
19/12/01 18:14:09 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/01 18:14:09 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/01 18:14:09 INFO sdk_worker.run: No more requests from control plane
19/12/01 18:14:09 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/01 18:14:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 18:14:09 INFO data_plane.close: Closing all cached grpc data channels.
19/12/01 18:14:09 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/01 18:14:09 INFO sdk_worker.run: Done consuming work.
19/12/01 18:14:09 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/01 18:14:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/01 18:14:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 18:14:10 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
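
The recurring "Discarding unparseable args" warnings above are expected noise: the worker-side options parser only understands SDK-level flags, and runner-only flags such as --spark_master or --job_server_timeout fall through as "unknown". A minimal sketch of the mechanism, assuming argparse-style parse_known_args (which matches the wording that pipeline_options.get_all_options logs here):

    import argparse

    parser = argparse.ArgumentParser()
    parser.add_argument('--job_endpoint')  # a flag this parser knows

    known, unknown = parser.parse_known_args([
        '--job_endpoint=localhost:52543',
        '--spark_master=local',       # runner-only: unknown to this parser
        '--job_server_timeout=60',    # runner-only: unknown to this parser
    ])
    print('Discarding unparseable args: %s' % unknown)
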
19/12/01 18:14:11 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/01 18:14:11 INFO sdk_worker_main.main: Logging handler created.
19/12/01 18:14:11 INFO sdk_worker_main.start: Status HTTP server running at localhost:41255
19/12/01 18:14:11 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/01 18:14:11 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/01 18:14:11 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575224047.07_d03acb74-235e-422c-924c-deda0bd67ed6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/01 18:14:11 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575224047.07', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52543', 'job_port': u'0'}
19/12/01 18:14:11 INFO statecache.__init__: Creating state cache with size 0
19/12/01 18:14:11 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40929.
19/12/01 18:14:11 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/01 18:14:11 INFO sdk_worker.__init__: Control channel established.
19/12/01 18:14:11 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/01 18:14:11 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37819.
19/12/01 18:14:11 INFO sdk_worker.create_state_handler: State channel established.
19/12/01 18:14:11 INFO data_plane.create_data_channel: Creating client data channel for localhost:41279
19/12/01 18:14:11 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/01 18:14:11 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/01 18:14:11 INFO sdk_worker.run: No more requests from control plane
19/12/01 18:14:11 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/01 18:14:11 INFO data_plane.close: Closing all cached grpc data channels.
19/12/01 18:14:11 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 18:14:11 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/01 18:14:11 INFO sdk_worker.run: Done consuming work.
19/12/01 18:14:11 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/01 18:14:11 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/01 18:14:11 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 18:14:11 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/01 18:14:12 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/01 18:14:12 INFO sdk_worker_main.main: Logging handler created.
19/12/01 18:14:12 INFO sdk_worker_main.start: Status HTTP server running at localhost:44021
19/12/01 18:14:12 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/01 18:14:12 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/01 18:14:12 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575224047.07_d03acb74-235e-422c-924c-deda0bd67ed6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/01 18:14:12 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575224047.07', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52543', 'job_port': u'0'}
19/12/01 18:14:12 INFO statecache.__init__: Creating state cache with size 0
19/12/01 18:14:12 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42507.
19/12/01 18:14:12 INFO sdk_worker.__init__: Control channel established.
19/12/01 18:14:12 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/01 18:14:12 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/01 18:14:12 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43667.
19/12/01 18:14:12 INFO sdk_worker.create_state_handler: State channel established.
19/12/01 18:14:12 INFO data_plane.create_data_channel: Creating client data channel for localhost:33717
19/12/01 18:14:12 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/01 18:14:12 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/01 18:14:12 INFO sdk_worker.run: No more requests from control plane
19/12/01 18:14:12 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/01 18:14:12 INFO data_plane.close: Closing all cached grpc data channels.
19/12/01 18:14:12 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 18:14:12 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/01 18:14:12 INFO sdk_worker.run: Done consuming work.
19/12/01 18:14:12 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/01 18:14:12 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/01 18:14:12 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 18:14:12 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/01 18:14:13 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/01 18:14:13 INFO sdk_worker_main.main: Logging handler created.
19/12/01 18:14:13 INFO sdk_worker_main.start: Status HTTP server running at localhost:41751
19/12/01 18:14:13 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/01 18:14:13 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/01 18:14:13 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575224047.07_d03acb74-235e-422c-924c-deda0bd67ed6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/01 18:14:13 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575224047.07', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52543', 'job_port': u'0'}
19/12/01 18:14:13 INFO statecache.__init__: Creating state cache with size 0
19/12/01 18:14:13 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44363.
19/12/01 18:14:13 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/01 18:14:13 INFO sdk_worker.__init__: Control channel established.
19/12/01 18:14:13 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/01 18:14:13 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45565.
19/12/01 18:14:13 INFO sdk_worker.create_state_handler: State channel established.
19/12/01 18:14:13 INFO data_plane.create_data_channel: Creating client data channel for localhost:44899
19/12/01 18:14:13 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/01 18:14:13 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/01 18:14:13 INFO sdk_worker.run: No more requests from control plane
19/12/01 18:14:13 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/01 18:14:13 INFO data_plane.close: Closing all cached grpc data channels.
19/12/01 18:14:13 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 18:14:13 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/01 18:14:13 INFO sdk_worker.run: Done consuming work.
19/12/01 18:14:13 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/01 18:14:13 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/01 18:14:13 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 18:14:13 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575224047.07_d03acb74-235e-422c-924c-deda0bd67ed6 finished.
19/12/01 18:14:13 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/01 18:14:13 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_9119c505-116b-413e-a604-0e42f7bffe15","basePath":"/tmp/sparktestrisL3D"}: {}
java.io.FileNotFoundException: /tmp/sparktestrisL3D/job_9119c505-116b-413e-a604-0e42f7bffe15/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
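
This cleanup stack trace is cosmetic rather than the cause of the build failure: the job staged no artifacts (note the repeated "GetManifest for __no_artifacts_staged__" lines), so the MANIFEST file the remover tries to open was never written. A tolerant version of the cleanup, sketched in Python under that assumption (the real code path is the Java BeamFileSystemArtifactStagingService.removeArtifacts shown above):

    import os
    import shutil

    def remove_artifacts(staging_dir):
        manifest = os.path.join(staging_dir, 'MANIFEST')
        if not os.path.exists(manifest):
            # Nothing was staged for this job, so there is nothing to remove.
            return
        shutil.rmtree(staging_dir)
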
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
==================== Timed out after 60 seconds. ====================
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140272040601344)>

# Thread: <Thread(Thread-118, started daemon 140271551510272)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <_MainThread(MainThread, started 140272821880576)>

# Thread: <Thread(wait_until_finish_read, started daemon 140271534724864)>

# Thread: <Thread(Thread-124, started daemon 140271543117568)>

# Thread: <Thread(Thread-118, started daemon 140271551510272)>

# Thread: <_MainThread(MainThread, started 140272821880576)>

# Thread: <Thread(wait_until_finish_read, started daemon 140272040601344)>
======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575224036.85_df39dfc2-1351-44aa-b4cc-3152bef3a1ae failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 363.548s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 52s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/5mw34r72dozze

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1662

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1662/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/12/01 12:13:53 INFO sdk_worker_main.start: Status HTTP server running at localhost:46707
19/12/01 12:13:53 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/01 12:13:53 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/01 12:13:53 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575202430.51_0a95af0d-f1e1-424a-82ce-af0c1978c806', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/01 12:13:53 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575202430.51', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:57751', 'job_port': u'0'}
19/12/01 12:13:53 INFO statecache.__init__: Creating state cache with size 0
19/12/01 12:13:53 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36091.
19/12/01 12:13:53 INFO sdk_worker.__init__: Control channel established.
19/12/01 12:13:53 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/01 12:13:53 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/01 12:13:53 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36939.
19/12/01 12:13:53 INFO sdk_worker.create_state_handler: State channel established.
19/12/01 12:13:53 INFO data_plane.create_data_channel: Creating client data channel for localhost:40289
19/12/01 12:13:53 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/01 12:13:53 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/01 12:13:53 INFO sdk_worker.run: No more requests from control plane
19/12/01 12:13:53 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/01 12:13:53 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 12:13:53 INFO data_plane.close: Closing all cached grpc data channels.
19/12/01 12:13:53 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/01 12:13:53 INFO sdk_worker.run: Done consuming work.
19/12/01 12:13:53 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/01 12:13:53 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/01 12:13:53 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 12:13:53 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
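
Each worker cycle above opens three plaintext gRPC channels back to the runner on localhost: control, state, and data ("Creating insecure control channel", "State channel established", "Creating client data channel"). A minimal sketch with grpcio, using a port copied from the log (illustrative; the runner allocates a fresh port per cycle):

    import grpc

    # Port taken from the "Creating insecure control channel" line above.
    control = grpc.insecure_channel('localhost:36091')
    grpc.channel_ready_future(control).result(timeout=10)  # "Control channel established."
    control.close()
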
19/12/01 12:13:54 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/01 12:13:54 INFO sdk_worker_main.main: Logging handler created.
19/12/01 12:13:54 INFO sdk_worker_main.start: Status HTTP server running at localhost:37659
19/12/01 12:13:54 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/01 12:13:54 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/01 12:13:54 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575202430.51_0a95af0d-f1e1-424a-82ce-af0c1978c806', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/01 12:13:54 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575202430.51', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:57751', 'job_port': u'0'}
19/12/01 12:13:54 INFO statecache.__init__: Creating state cache with size 0
19/12/01 12:13:54 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33469.
19/12/01 12:13:54 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/01 12:13:54 INFO sdk_worker.__init__: Control channel established.
19/12/01 12:13:54 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/01 12:13:54 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38017.
19/12/01 12:13:54 INFO sdk_worker.create_state_handler: State channel established.
19/12/01 12:13:54 INFO data_plane.create_data_channel: Creating client data channel for localhost:42779
19/12/01 12:13:54 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/01 12:13:54 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/01 12:13:54 INFO sdk_worker.run: No more requests from control plane
19/12/01 12:13:54 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/01 12:13:54 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 12:13:54 INFO data_plane.close: Closing all cached grpc data channels.
19/12/01 12:13:54 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/01 12:13:54 INFO sdk_worker.run: Done consuming work.
19/12/01 12:13:54 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/01 12:13:54 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/01 12:13:54 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 12:13:54 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/01 12:13:55 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/01 12:13:55 INFO sdk_worker_main.main: Logging handler created.
19/12/01 12:13:55 INFO sdk_worker_main.start: Status HTTP server running at localhost:45869
19/12/01 12:13:55 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/01 12:13:55 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/01 12:13:55 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575202430.51_0a95af0d-f1e1-424a-82ce-af0c1978c806', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/01 12:13:55 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575202430.51', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:57751', 'job_port': u'0'}
19/12/01 12:13:55 INFO statecache.__init__: Creating state cache with size 0
19/12/01 12:13:55 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44459.
19/12/01 12:13:55 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/01 12:13:55 INFO sdk_worker.__init__: Control channel established.
19/12/01 12:13:55 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/01 12:13:55 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46813.
19/12/01 12:13:55 INFO sdk_worker.create_state_handler: State channel established.
19/12/01 12:13:55 INFO data_plane.create_data_channel: Creating client data channel for localhost:40735
19/12/01 12:13:55 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/01 12:13:55 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/01 12:13:55 INFO sdk_worker.run: No more requests from control plane
19/12/01 12:13:55 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/01 12:13:55 INFO data_plane.close: Closing all cached grpc data channels.
19/12/01 12:13:55 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/01 12:13:55 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 12:13:55 INFO sdk_worker.run: Done consuming work.
19/12/01 12:13:55 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/01 12:13:55 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/01 12:13:55 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 12:13:56 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/01 12:13:56 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/01 12:13:56 INFO sdk_worker_main.main: Logging handler created.
19/12/01 12:13:56 INFO sdk_worker_main.start: Status HTTP server running at localhost:38903
19/12/01 12:13:56 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/01 12:13:56 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/01 12:13:56 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575202430.51_0a95af0d-f1e1-424a-82ce-af0c1978c806', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/01 12:13:56 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575202430.51', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:57751', 'job_port': u'0'}
19/12/01 12:13:56 INFO statecache.__init__: Creating state cache with size 0
19/12/01 12:13:56 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43969.
19/12/01 12:13:56 INFO sdk_worker.__init__: Control channel established.
19/12/01 12:13:56 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/01 12:13:56 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/01 12:13:56 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44521.
19/12/01 12:13:56 INFO sdk_worker.create_state_handler: State channel established.
19/12/01 12:13:56 INFO data_plane.create_data_channel: Creating client data channel for localhost:40269
19/12/01 12:13:56 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/01 12:13:56 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/01 12:13:56 INFO sdk_worker.run: No more requests from control plane
19/12/01 12:13:56 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/01 12:13:56 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 12:13:56 INFO data_plane.close: Closing all cached grpc data channels.
19/12/01 12:13:56 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/01 12:13:56 INFO sdk_worker.run: Done consuming work.
19/12/01 12:13:56 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/01 12:13:56 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/01 12:13:57 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 12:13:57 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575202430.51_0a95af0d-f1e1-424a-82ce-af0c1978c806 finished.
19/12/01 12:13:57 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/01 12:13:57 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_ec3d552e-2f53-4045-ac2f-8ced9250d7d5","basePath":"/tmp/sparktestH370Sx"}: {}
java.io.FileNotFoundException: /tmp/sparktestH370Sx/job_ec3d552e-2f53-4045-ac2f-8ced9250d7d5/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
==================== Timed out after 60 seconds. ====================

BaseException: Timed out after 60 seconds.

======================================================================
# Thread: <Thread(wait_until_finish_read, started daemon 140613191739136)>

ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
# Thread: <Thread(Thread-119, started daemon 140613183346432)>

# Thread: <_MainThread(MainThread, started 140613970847488)>
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140613157119744)>

# Thread: <Thread(Thread-125, started daemon 140613165774592)>

  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
# Thread: <_MainThread(MainThread, started 140613970847488)>

# Thread: <Thread(Thread-119, started daemon 140613183346432)>

# Thread: <Thread(wait_until_finish_read, started daemon 140613191739136)>
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575202417.02_c2504de6-3ed7-449a-9321-a85a8cb98407 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 359.553s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 0s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/widgowtoxhy5w

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1661

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1661/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/12/01 06:15:06 INFO sdk_worker_main.start: Status HTTP server running at localhost:44161
19/12/01 06:15:06 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/01 06:15:06 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/01 06:15:06 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575180903.43_39ed56dd-132f-4ddb-8fa2-877a640281ba', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/01 06:15:06 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575180903.43', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41149', 'job_port': u'0'}
19/12/01 06:15:06 INFO statecache.__init__: Creating state cache with size 0
19/12/01 06:15:06 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45393.
19/12/01 06:15:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/01 06:15:06 INFO sdk_worker.__init__: Control channel established.
19/12/01 06:15:06 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/01 06:15:06 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45171.
19/12/01 06:15:06 INFO sdk_worker.create_state_handler: State channel established.
19/12/01 06:15:06 INFO data_plane.create_data_channel: Creating client data channel for localhost:41081
19/12/01 06:15:06 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/01 06:15:06 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/01 06:15:06 INFO sdk_worker.run: No more requests from control plane
19/12/01 06:15:06 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/01 06:15:06 INFO data_plane.close: Closing all cached grpc data channels.
19/12/01 06:15:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 06:15:06 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/01 06:15:06 INFO sdk_worker.run: Done consuming work.
19/12/01 06:15:06 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/01 06:15:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/01 06:15:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 06:15:06 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
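
The rapid connect/exit churn throughout these logs (one full worker startup and teardown per bundle, each ending in "Closing environment urn: beam:env:process:v1") appears to follow from 'environment_cache_millis': u'0' in the pipeline options above: with a zero TTL the job bundle factory does not reuse process environments between bundles. A toy sketch of such a TTL policy, purely illustrative (the real logic lives in the Java DefaultJobBundleFactory):

    import time

    class EnvironmentCache(object):
        # Toy model: urn -> (environment, expiry_time)
        def __init__(self, ttl_millis):
            self._ttl_secs = ttl_millis / 1000.0
            self._entries = {}

        def get(self, urn, factory):
            env, expiry = self._entries.get(urn, (None, 0.0))
            if env is None or time.time() >= expiry:
                if env is not None:
                    env.close()  # "Closing environment urn: ..."
                env = factory()  # fork a fresh sdk_worker process
                self._entries[urn] = (env, time.time() + self._ttl_secs)
            return env
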
19/12/01 06:15:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/01 06:15:07 INFO sdk_worker_main.main: Logging handler created.
19/12/01 06:15:07 INFO sdk_worker_main.start: Status HTTP server running at localhost:42365
19/12/01 06:15:07 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/01 06:15:07 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/01 06:15:07 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575180903.43_39ed56dd-132f-4ddb-8fa2-877a640281ba', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/01 06:15:07 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575180903.43', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41149', 'job_port': u'0'}
19/12/01 06:15:07 INFO statecache.__init__: Creating state cache with size 0
19/12/01 06:15:07 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42973.
19/12/01 06:15:07 INFO sdk_worker.__init__: Control channel established.
19/12/01 06:15:07 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/01 06:15:07 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/01 06:15:07 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34961.
19/12/01 06:15:07 INFO sdk_worker.create_state_handler: State channel established.
19/12/01 06:15:07 INFO data_plane.create_data_channel: Creating client data channel for localhost:45213
19/12/01 06:15:07 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/01 06:15:07 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/01 06:15:07 INFO sdk_worker.run: No more requests from control plane
19/12/01 06:15:07 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/01 06:15:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 06:15:07 INFO data_plane.close: Closing all cached grpc data channels.
19/12/01 06:15:07 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/01 06:15:07 INFO sdk_worker.run: Done consuming work.
19/12/01 06:15:07 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/01 06:15:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/01 06:15:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 06:15:07 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/01 06:15:08 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/01 06:15:08 INFO sdk_worker_main.main: Logging handler created.
19/12/01 06:15:08 INFO sdk_worker_main.start: Status HTTP server running at localhost:33693
19/12/01 06:15:08 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/01 06:15:08 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/01 06:15:08 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575180903.43_39ed56dd-132f-4ddb-8fa2-877a640281ba', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/01 06:15:08 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575180903.43', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41149', 'job_port': u'0'}
19/12/01 06:15:08 INFO statecache.__init__: Creating state cache with size 0
19/12/01 06:15:08 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43601.
19/12/01 06:15:08 INFO sdk_worker.__init__: Control channel established.
19/12/01 06:15:08 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/01 06:15:08 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/01 06:15:08 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36295.
19/12/01 06:15:08 INFO sdk_worker.create_state_handler: State channel established.
19/12/01 06:15:08 INFO data_plane.create_data_channel: Creating client data channel for localhost:35417
19/12/01 06:15:08 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/01 06:15:08 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

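For reference, the pipeline_options mapping logged above describes a portable pipeline submitted to the local job server with a process-based Python SDK harness. A minimal sketch of constructing equivalent options (an assumed illustration only; the endpoint and sdk_worker.sh path mirror the log, and the worker-script path is a placeholder):

    from apache_beam.options.pipeline_options import PipelineOptions

    # Flags mirror the logged options; sdk_worker.sh is the command the
    # runner uses to launch each SDK harness process.
    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:41149',
        '--environment_type=PROCESS',
        '--environment_config={"command": "/path/to/sdk_worker.sh"}',
    ])
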
19/12/01 06:15:08 INFO sdk_worker.run: No more requests from control plane
19/12/01 06:15:08 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/01 06:15:08 INFO data_plane.close: Closing all cached grpc data channels.
19/12/01 06:15:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 06:15:08 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/01 06:15:08 INFO sdk_worker.run: Done consuming work.
19/12/01 06:15:08 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/01 06:15:08 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/01 06:15:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 06:15:08 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/01 06:15:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/01 06:15:09 INFO sdk_worker_main.main: Logging handler created.
19/12/01 06:15:09 INFO sdk_worker_main.start: Status HTTP server running at localhost:38179
19/12/01 06:15:09 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/01 06:15:09 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/01 06:15:09 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575180903.43_39ed56dd-132f-4ddb-8fa2-877a640281ba', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/01 06:15:09 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575180903.43', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41149', 'job_port': u'0'}
19/12/01 06:15:09 INFO statecache.__init__: Creating state cache with size 0
19/12/01 06:15:09 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40855.
19/12/01 06:15:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/01 06:15:09 INFO sdk_worker.__init__: Control channel established.
19/12/01 06:15:09 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/01 06:15:09 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46451.
19/12/01 06:15:09 INFO sdk_worker.create_state_handler: State channel established.
19/12/01 06:15:09 INFO data_plane.create_data_channel: Creating client data channel for localhost:33645
19/12/01 06:15:09 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/01 06:15:09 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/01 06:15:09 INFO sdk_worker.run: No more requests from control plane
19/12/01 06:15:09 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/01 06:15:09 INFO data_plane.close: Closing all cached grpc data channels.
19/12/01 06:15:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 06:15:09 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/01 06:15:09 INFO sdk_worker.run: Done consuming work.
19/12/01 06:15:09 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/01 06:15:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/01 06:15:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 06:15:09 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575180903.43_39ed56dd-132f-4ddb-8fa2-877a640281ba finished.
19/12/01 06:15:09 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/01 06:15:09 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_f422cdc0-36ca-4a3f-b490-a9f35c3af59a","basePath":"/tmp/sparktestLcRVOs"}: {}
java.io.FileNotFoundException: /tmp/sparktestLcRVOs/job_f422cdc0-36ca-4a3f-b490-a9f35c3af59a/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

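The BaseException above is raised by the suite's timeout watchdog (the handler frame at portable_runner_test.py line 75), which also prints the "# Thread:" dumps that appear alongside the tracebacks below. A rough sketch of that pattern, assumed from the traceback rather than taken from the Beam source:

    import signal
    import threading

    TIMEOUT_SECS = 60

    def handler(signum, frame):
        msg = 'Timed out after %d seconds.' % TIMEOUT_SECS
        print('==================== %s ====================' % msg)
        for t in threading.enumerate():
            # Matches the "# Thread: <...>" lines in the log output.
            print('# Thread: %s' % t)
        raise BaseException(msg)

    # Arm the alarm before running the pipeline under test (Unix only).
    signal.signal(signal.SIGALRM, handler)
    signal.alarm(TIMEOUT_SECS)
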
======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140464365328128)>
# Thread: <Thread(Thread-118, started daemon 140464373720832)>
# Thread: <_MainThread(MainThread, started 140465152829184)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140464271382272)>
# Thread: <Thread(Thread-124, started daemon 140464279774976)>
# Thread: <Thread(Thread-118, started daemon 140464373720832)>
# Thread: <_MainThread(MainThread, started 140465152829184)>
# Thread: <Thread(wait_until_finish_read, started daemon 140464365328128)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575180892.6_b1ddb27a-2a34-40db-a13a-f5d4d66ec74a failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 338.481s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 39s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/fsgxyjrls2aia

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1660

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1660/display/redirect>

Changes:


------------------------------------------
[...truncated 1.31 MB...]
19/12/01 00:16:17 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1575159376.84_4487da24-21a5-4203-9bc0-ea21c5cfa0e8 on Spark master local
19/12/01 00:16:17 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/12/01 00:16:17 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575159376.84_4487da24-21a5-4203-9bc0-ea21c5cfa0e8: Pipeline translated successfully. Computing outputs
19/12/01 00:16:17 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/01 00:16:18 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/01 00:16:18 INFO sdk_worker_main.main: Logging handler created.
19/12/01 00:16:18 INFO sdk_worker_main.start: Status HTTP server running at localhost:45899
19/12/01 00:16:18 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/01 00:16:18 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/01 00:16:18 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575159376.84_4487da24-21a5-4203-9bc0-ea21c5cfa0e8', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/01 00:16:18 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575159376.84', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44733', 'job_port': u'0'}
19/12/01 00:16:18 INFO statecache.__init__: Creating state cache with size 0
19/12/01 00:16:18 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46547.
19/12/01 00:16:18 INFO sdk_worker.__init__: Control channel established.
19/12/01 00:16:18 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/01 00:16:18 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/01 00:16:18 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36457.
19/12/01 00:16:18 INFO sdk_worker.create_state_handler: State channel established.
19/12/01 00:16:18 INFO data_plane.create_data_channel: Creating client data channel for localhost:37683
19/12/01 00:16:18 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/01 00:16:18 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/01 00:16:18 INFO sdk_worker.run: No more requests from control plane
19/12/01 00:16:18 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/01 00:16:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 00:16:18 INFO data_plane.close: Closing all cached grpc data channels.
19/12/01 00:16:18 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/01 00:16:18 INFO sdk_worker.run: Done consuming work.
19/12/01 00:16:18 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/01 00:16:18 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/01 00:16:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 00:16:18 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/01 00:16:19 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/01 00:16:19 INFO sdk_worker_main.main: Logging handler created.
19/12/01 00:16:19 INFO sdk_worker_main.start: Status HTTP server running at localhost:40521
19/12/01 00:16:19 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/01 00:16:19 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/01 00:16:19 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575159376.84_4487da24-21a5-4203-9bc0-ea21c5cfa0e8', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/01 00:16:19 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575159376.84', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44733', 'job_port': u'0'}
19/12/01 00:16:19 INFO statecache.__init__: Creating state cache with size 0
19/12/01 00:16:19 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43705.
19/12/01 00:16:19 INFO sdk_worker.__init__: Control channel established.
19/12/01 00:16:19 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/01 00:16:19 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/01 00:16:19 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38251.
19/12/01 00:16:19 INFO sdk_worker.create_state_handler: State channel established.
19/12/01 00:16:19 INFO data_plane.create_data_channel: Creating client data channel for localhost:34325
19/12/01 00:16:19 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/01 00:16:19 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/01 00:16:19 INFO sdk_worker.run: No more requests from control plane
19/12/01 00:16:19 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/01 00:16:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 00:16:19 INFO data_plane.close: Closing all cached grpc data channels.
19/12/01 00:16:19 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/01 00:16:19 INFO sdk_worker.run: Done consuming work.
19/12/01 00:16:19 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/01 00:16:19 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/01 00:16:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 00:16:19 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/01 00:16:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/01 00:16:20 INFO sdk_worker_main.main: Logging handler created.
19/12/01 00:16:20 INFO sdk_worker_main.start: Status HTTP server running at localhost:38959
19/12/01 00:16:20 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/01 00:16:20 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/01 00:16:20 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575159376.84_4487da24-21a5-4203-9bc0-ea21c5cfa0e8', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/01 00:16:20 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575159376.84', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44733', 'job_port': u'0'}
19/12/01 00:16:20 INFO statecache.__init__: Creating state cache with size 0
19/12/01 00:16:20 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36723.
19/12/01 00:16:20 INFO sdk_worker.__init__: Control channel established.
19/12/01 00:16:20 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/01 00:16:20 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/01 00:16:20 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34523.
19/12/01 00:16:20 INFO sdk_worker.create_state_handler: State channel established.
19/12/01 00:16:20 INFO data_plane.create_data_channel: Creating client data channel for localhost:43515
19/12/01 00:16:20 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/01 00:16:20 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/01 00:16:20 INFO sdk_worker.run: No more requests from control plane
19/12/01 00:16:20 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/01 00:16:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 00:16:20 INFO data_plane.close: Closing all cached grpc data channels.
19/12/01 00:16:20 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/01 00:16:20 INFO sdk_worker.run: Done consuming work.
19/12/01 00:16:20 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/01 00:16:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/01 00:16:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 00:16:20 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/01 00:16:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/01 00:16:20 INFO sdk_worker_main.main: Logging handler created.
19/12/01 00:16:20 INFO sdk_worker_main.start: Status HTTP server running at localhost:34003
19/12/01 00:16:20 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/01 00:16:20 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/01 00:16:20 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575159376.84_4487da24-21a5-4203-9bc0-ea21c5cfa0e8', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/01 00:16:20 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575159376.84', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44733', 'job_port': u'0'}
19/12/01 00:16:20 INFO statecache.__init__: Creating state cache with size 0
19/12/01 00:16:20 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37411.
19/12/01 00:16:20 INFO sdk_worker.__init__: Control channel established.
19/12/01 00:16:20 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/01 00:16:20 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/01 00:16:20 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40947.
19/12/01 00:16:20 INFO sdk_worker.create_state_handler: State channel established.
19/12/01 00:16:20 INFO data_plane.create_data_channel: Creating client data channel for localhost:39605
19/12/01 00:16:20 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/01 00:16:20 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/01 00:16:20 INFO sdk_worker.run: No more requests from control plane
19/12/01 00:16:20 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/01 00:16:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 00:16:20 INFO data_plane.close: Closing all cached grpc data channels.
19/12/01 00:16:20 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/01 00:16:20 INFO sdk_worker.run: Done consuming work.
19/12/01 00:16:20 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/01 00:16:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/01 00:16:21 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 00:16:21 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/01 00:16:21 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/01 00:16:21 INFO sdk_worker_main.main: Logging handler created.
19/12/01 00:16:21 INFO sdk_worker_main.start: Status HTTP server running at localhost:38603
19/12/01 00:16:21 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/01 00:16:21 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/01 00:16:21 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575159376.84_4487da24-21a5-4203-9bc0-ea21c5cfa0e8', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/01 00:16:21 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575159376.84', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44733', 'job_port': u'0'}
19/12/01 00:16:21 INFO statecache.__init__: Creating state cache with size 0
19/12/01 00:16:21 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39541.
19/12/01 00:16:21 INFO sdk_worker.__init__: Control channel established.
19/12/01 00:16:21 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/01 00:16:21 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/01 00:16:21 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42945.
19/12/01 00:16:21 INFO sdk_worker.create_state_handler: State channel established.
19/12/01 00:16:21 INFO data_plane.create_data_channel: Creating client data channel for localhost:39745
19/12/01 00:16:21 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/01 00:16:21 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/01 00:16:21 INFO sdk_worker.run: No more requests from control plane
19/12/01 00:16:21 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/01 00:16:21 INFO data_plane.close: Closing all cached grpc data channels.
19/12/01 00:16:21 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 00:16:21 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/01 00:16:21 INFO sdk_worker.run: Done consuming work.
19/12/01 00:16:21 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/01 00:16:21 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/01 00:16:21 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 00:16:21 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575159376.84_4487da24-21a5-4203-9bc0-ea21c5cfa0e8 finished.
19/12/01 00:16:21 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/01 00:16:21 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_b4f3d69f-4a39-4144-b371-18770b2f92f7","basePath":"/tmp/sparktestqn2xiM"}: {}
java.io.FileNotFoundException: /tmp/sparktestqn2xiM/job_b4f3d69f-4a39-4144-b371-18770b2f92f7/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575159368.3_d83954d4-a21b-4bbf-9dbd-2fe0eae8d139 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140119541540608)>
# Thread: <Thread(Thread-118, started daemon 140119533147904)>
# Thread: <_MainThread(MainThread, started 140120391169792)>

----------------------------------------------------------------------
Ran 38 tests in 266.073s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 9s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/jrbtxjbv53o34

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1659

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1659/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-7 (beam) in workspace <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/>
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 1f64ba3aeb093c77c4d931fb6791b8b239be3f85 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 1f64ba3aeb093c77c4d931fb6791b8b239be3f85
Commit message: "Merge pull request #10105: [BEAM-4776] Add metrics support to Java PortableRunner"
 > git rev-list --no-walk 1f64ba3aeb093c77c4d931fb6791b8b239be3f85 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/gradlew> --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g :sdks:python:test-suites:portable:py2:sparkValidatesRunner
The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=a797e98f-ad75-422a-a42c-f30b1842b92f, currentDir=https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src}
Attempting to read last messages from the daemon log...
Daemon pid: 12933
  log file: /home/jenkins/.gradle/daemon/5.2.1/daemon-12933.out.log
----- Last  20 lines from daemon log file - daemon-12933.out.log -----
18:05:13.801 [DEBUG] [org.gradle.launcher.daemon.server.DefaultDaemonConnection] thread 114: Received non-IO message from client: Build{id=a797e98f-ad75-422a-a42c-f30b1842b92f, currentDir=https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src}
18:05:13.802 [INFO] [org.gradle.launcher.daemon.server.DefaultIncomingConnectionHandler] Received command: Build{id=a797e98f-ad75-422a-a42c-f30b1842b92f, currentDir=https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src}.
18:05:13.802 [DEBUG] [org.gradle.launcher.daemon.server.DefaultIncomingConnectionHandler] Starting executing command: Build{id=a797e98f-ad75-422a-a42c-f30b1842b92f, currentDir=https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src} with connection: socket connection from /0:0:0:0:0:0:0:1:46033 to /0:0:0:0:0:0:0:1:54266.
18:05:13.802 [ERROR] [org.gradle.launcher.daemon.server.DaemonStateCoordinator] Command execution: started DaemonCommandExecution[command = Build{id=a797e98f-ad75-422a-a42c-f30b1842b92f, currentDir=https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src}, connection = DefaultDaemonConnection: socket connection from /0:0:0:0:0:0:0:1:46033 to /0:0:0:0:0:0:0:1:54266] after 0.0 minutes of idle
18:05:13.802 [INFO] [org.gradle.launcher.daemon.server.DaemonRegistryUpdater] Marking the daemon as busy, address: [724d7790-1aa3-4ed1-8189-3232e24e40ba port:46033, addresses:[/0:0:0:0:0:0:0:1%lo, /127.0.0.1]]
18:05:13.802 [DEBUG] [org.gradle.launcher.daemon.registry.PersistentDaemonRegistry] Marking busy by address: [724d7790-1aa3-4ed1-8189-3232e24e40ba port:46033, addresses:[/0:0:0:0:0:0:0:1%lo, /127.0.0.1]]
18:05:13.802 [DEBUG] [org.gradle.cache.internal.DefaultFileLockManager] Waiting to acquire exclusive lock on daemon addresses registry.
18:05:13.803 [DEBUG] [org.gradle.cache.internal.DefaultFileLockManager] Lock acquired on daemon addresses registry.
18:05:13.803 [DEBUG] [org.gradle.cache.internal.DefaultFileLockManager] Releasing lock on daemon addresses registry.
18:05:13.803 [DEBUG] [org.gradle.launcher.daemon.server.DaemonStateCoordinator] resetting idle timer
18:05:13.803 [DEBUG] [org.gradle.launcher.daemon.server.DaemonStateCoordinator] daemon is running. Sleeping until state changes.
18:05:13.803 [INFO] [org.gradle.launcher.daemon.server.exec.StartBuildOrRespondWithBusy] Daemon is about to start building Build{id=a797e98f-ad75-422a-a42c-f30b1842b92f, currentDir=https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src}. Dispatching build started information...
18:05:13.803 [DEBUG] [org.gradle.launcher.daemon.server.SynchronizedDispatchConnection] thread 26: dispatching class org.gradle.launcher.daemon.protocol.BuildStarted
18:05:13.804 [DEBUG] [org.gradle.launcher.daemon.server.exec.EstablishBuildEnvironment] Configuring env variables: {PATH=/home/jenkins/tools/java/latest1.8/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games, RUN_DISPLAY_URL=https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1659/display/redirect, HUDSON_HOME=/x1/jenkins/jenkins-home, RUN_CHANGES_DISPLAY_URL=https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1659/display/redirect?page=changes, JOB_URL=https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/, HUDSON_COOKIE=1c398942-becf-455b-9e59-da903e7e45f9, NIX_LABEL=ubuntu, SLACK_WEBHOOK_URL=****, SUDO_USER=yifanzou, MAIL=/var/mail/jenkins, WIN_LABEL=Windows, JENKINS_SERVER_COOKIE=f4ebd1e6b0d976e8, USERNAME=root, LOGNAME=jenkins, PWD=<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src,> JENKINS_URL=https://builds.apache.org/, SHELL=/bin/bash, BUILD_TAG=jenkins-beam_PostCommit_Python_VR_Spark-1659, GIT_AUTHOR_EMAIL=builds@apache.org, LESSOPEN=| /usr/bin/lesspipe %s, ROOT_BUILD_CAUSE=TIMERTRIGGER, BUILD_CAUSE_TIMERTRIGGER=true, GIT_AUTHOR_NAME=jenkins, OLDPWD=<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src,> JENKINS_HOME=/x1/jenkins/jenkins-home, sha1=master, NODE_NAME=apache-beam-jenkins-7, BUILD_DISPLAY_NAME=#1659, JOB_DISPLAY_URL=https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/display/redirect, GIT_BRANCH=origin/master, LS_COLORS=rs=0:di=01;34:ln=01;36:mh=00:pi=40;33:so=01;35:do=01;35:bd=40;33;01:cd=40;33;01:or=40;31;01:mi=00:su=37;41:sg=30;43:ca=30;41:tw=30;42:ow=34;42:st=37;44:ex=01;32:*.tar=01;31:*.tgz=01;31:*.arc=01;31:*.arj=01;31:*.taz=01;31:*.lha=01;31:*.lz4=01;31:*.lzh=01;31:*.lzma=01;31:*.tlz=01;31:*.txz=01;31:*.tzo=01;31:*.t7z=01;31:*.zip=01;31:*.z=01;31:*.Z=01;31:*.dz=01;31:*.gz=01;31:*.lrz=01;31:*.lz=01;31:*.lzo=01;31:*.xz=01;31:*.bz2=01;31:*.bz=01;31:*.tbz=01;31:*.tbz2=01;31:*.tz=01;31:*.deb=01;31:*.rpm=01;31:*.jar=01;31:*.war=01;31:*.ear=01;31:*.sar=01;31:*.rar=01;31:*.alz=01;31:*.ace=01;31:*.zoo=01;31:*.cpio=01;31:*.7z=01;31:*.rz=01;31:*.cab=01;31:*.jpg=01;35:*.jpeg=01;35:*.gif=01;35:*.bmp=01;35:*.pbm=01;35:*.pgm=01;35:*.ppm=01;35:*.tga=01;35:*.xbm=01;35:*.xpm=01;35:*.tif=01;35:*.tiff=01;35:*.png=01;35:*.svg=01;35:*.svgz=01;35:*.mng=01;35:*.pcx=01;35:*.mov=01;35:*.mpg=01;35:*.mpeg=01;35:*.m2v=01;35:*.mkv=01;35:*.webm=01;35:*.ogm=01;35:*.mp4=01;35:*.m4v=01;35:*.mp4v=01;35:*.vob=01;35:*.qt=01;35:*.nuv=01;35:*.wmv=01;35:*.asf=01;35:*.rm=01;35:*.rmvb=01;35:*.flc=01;35:*.avi=01;35:*.fli=01;35:*.flv=01;35:*.gl=01;35:*.dl=01;35:*.xcf=01;35:*.xwd=01;35:*.yuv=01;35:*.cgm=01;35:*.emf=01;35:*.ogv=01;35:*.ogx=01;35:*.aac=00;36:*.au=00;36:*.flac=00;36:*.m4a=00;36:*.mid=00;36:*.midi=00;36:*.mka=00;36:*.mp3=00;36:*.mpc=00;36:*.ogg=00;36:*.ra=00;36:*.wav=00;36:*.oga=00;36:*.opus=00;36:*.spx=00;36:*.xspf=00;36:, SHLVL=1, GIT_PREVIOUS_COMMIT=1f64ba3aeb093c77c4d931fb6791b8b239be3f85, LESSCLOSE=/usr/bin/lesspipe %s %s, JAVA_HOME=/home/jenkins/tools/java/latest1.8, TERM=xterm-256color, BUILD_ID=1659, LANG=en_US.UTF-8, JOB_NAME=beam_PostCommit_Python_VR_Spark, SPARK_LOCAL_IP=127.0.0.1, BUILD_CAUSE=TIMERTRIGGER, SUDO_GID=1014, GIT_PREVIOUS_SUCCESSFUL_COMMIT=fc77c31fe7fa99425862de688251a09ded57bf54, NODE_LABELS=apache-beam-jenkins-7 beam, HUDSON_URL=https://builds.apache.org/, WORKSPACE=<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/,> ROOT_BUILD_CAUSE_TIMERTRIGGER=true, SUDO_UID=1013, _=/usr/bin/nohup, GIT_COMMIT=1f64ba3aeb093c77c4d931fb6791b8b239be3f85, 
COVERALLS_REPO_TOKEN=****, EXECUTOR_NUMBER=1, HUDSON_SERVER_COOKIE=f4ebd1e6b0d976e8, GIT_COMMITTER_NAME=jenkins, JOB_BASE_NAME=beam_PostCommit_Python_VR_Spark, USER=jenkins, SUDO_COMMAND=/bin/su jenkins, BUILD_NUMBER=1659, BUILD_URL=https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1659/, GIT_COMMITTER_EMAIL=builds@apache.org, GIT_URL=https://github.com/apache/beam.git, HOME=/home/jenkins}
18:05:13.805 [DEBUG] [org.gradle.launcher.daemon.server.exec.LogToClient] About to start relaying all logs to the client via the connection.
18:05:13.805 [INFO] [org.gradle.launcher.daemon.server.exec.LogToClient] The client will now receive all logging from the daemon (pid: 12933). The daemon log file: /home/jenkins/.gradle/daemon/5.2.1/daemon-12933.out.log
18:05:13.805 [INFO] [org.gradle.launcher.daemon.server.exec.LogAndCheckHealth] Starting 2nd build in daemon [uptime: 4 mins 55.395 secs, performance: 100%]
18:05:13.806 [DEBUG] [org.gradle.launcher.daemon.server.exec.ExecuteBuild] The daemon has started executing the build.
18:05:13.806 [DEBUG] [org.gradle.launcher.daemon.server.exec.ExecuteBuild] Executing build with daemon context: DefaultDaemonContext[uid=ff25c547-99d5-42b9-9f17-59fef341331e,javaHome=/usr/lib/jvm/java-8-openjdk-amd64,daemonRegistryDir=/home/jenkins/.gradle/daemon,pid=12933,idleTimeout=10800000,priority=NORMAL,daemonOpts=-Xmx4g,-Dfile.encoding=UTF-8,-Duser.country=US,-Duser.language=en,-Duser.variant]
Daemon vm is shutting down... The daemon has exited normally or was terminated in response to a user interrupt.
----- End of the daemon log -----


FAILURE: Build failed with an exception.

* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may have crashed)

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1658

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1658/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/11/30 12:14:32 INFO sdk_worker_main.start: Status HTTP server running at localhost:35437
19/11/30 12:14:32 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/30 12:14:32 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/30 12:14:32 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575116069.86_3cd1cbfa-7915-4c83-aa7b-2c6f436fef16', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/30 12:14:32 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575116069.86', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52465', 'job_port': u'0'}
19/11/30 12:14:32 INFO statecache.__init__: Creating state cache with size 0
19/11/30 12:14:32 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36037.
19/11/30 12:14:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/30 12:14:32 INFO sdk_worker.__init__: Control channel established.
19/11/30 12:14:32 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/30 12:14:32 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43925.
19/11/30 12:14:32 INFO sdk_worker.create_state_handler: State channel established.
19/11/30 12:14:32 INFO data_plane.create_data_channel: Creating client data channel for localhost:42373
19/11/30 12:14:32 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/30 12:14:32 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/30 12:14:32 INFO sdk_worker.run: No more requests from control plane
19/11/30 12:14:32 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/30 12:14:32 INFO data_plane.close: Closing all cached grpc data channels.
19/11/30 12:14:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 12:14:32 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/30 12:14:32 INFO sdk_worker.run: Done consuming work.
19/11/30 12:14:32 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/30 12:14:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/30 12:14:33 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 12:14:33 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/30 12:14:33 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/30 12:14:33 INFO sdk_worker_main.main: Logging handler created.
19/11/30 12:14:33 INFO sdk_worker_main.start: Status HTTP server running at localhost:39827
19/11/30 12:14:33 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/30 12:14:33 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/30 12:14:33 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575116069.86_3cd1cbfa-7915-4c83-aa7b-2c6f436fef16', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/30 12:14:33 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575116069.86', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52465', 'job_port': u'0'}
19/11/30 12:14:33 INFO statecache.__init__: Creating state cache with size 0
19/11/30 12:14:33 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35191.
19/11/30 12:14:33 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/30 12:14:33 INFO sdk_worker.__init__: Control channel established.
19/11/30 12:14:33 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/30 12:14:34 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:32903.
19/11/30 12:14:34 INFO sdk_worker.create_state_handler: State channel established.
19/11/30 12:14:34 INFO data_plane.create_data_channel: Creating client data channel for localhost:36261
19/11/30 12:14:34 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/30 12:14:34 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/30 12:14:34 INFO sdk_worker.run: No more requests from control plane
19/11/30 12:14:34 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/30 12:14:34 INFO data_plane.close: Closing all cached grpc data channels.
19/11/30 12:14:34 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 12:14:34 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/30 12:14:34 INFO sdk_worker.run: Done consuming work.
19/11/30 12:14:34 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/30 12:14:34 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/30 12:14:34 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 12:14:34 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/30 12:14:34 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/30 12:14:34 INFO sdk_worker_main.main: Logging handler created.
19/11/30 12:14:34 INFO sdk_worker_main.start: Status HTTP server running at localhost:41055
19/11/30 12:14:34 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/30 12:14:34 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/30 12:14:35 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575116069.86_3cd1cbfa-7915-4c83-aa7b-2c6f436fef16', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/30 12:14:35 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575116069.86', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52465', 'job_port': u'0'}
19/11/30 12:14:35 INFO statecache.__init__: Creating state cache with size 0
19/11/30 12:14:35 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33361.
19/11/30 12:14:35 INFO sdk_worker.__init__: Control channel established.
19/11/30 12:14:35 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/30 12:14:35 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/30 12:14:35 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44789.
19/11/30 12:14:35 INFO sdk_worker.create_state_handler: State channel established.
19/11/30 12:14:35 INFO data_plane.create_data_channel: Creating client data channel for localhost:33401
19/11/30 12:14:35 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/30 12:14:35 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/30 12:14:35 INFO sdk_worker.run: No more requests from control plane
19/11/30 12:14:35 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/30 12:14:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 12:14:35 INFO data_plane.close: Closing all cached grpc data channels.
19/11/30 12:14:35 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/30 12:14:35 INFO sdk_worker.run: Done consuming work.
19/11/30 12:14:35 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/30 12:14:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/30 12:14:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 12:14:35 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/30 12:14:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/30 12:14:36 INFO sdk_worker_main.main: Logging handler created.
19/11/30 12:14:36 INFO sdk_worker_main.start: Status HTTP server running at localhost:37463
19/11/30 12:14:36 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/30 12:14:36 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/30 12:14:36 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575116069.86_3cd1cbfa-7915-4c83-aa7b-2c6f436fef16', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/30 12:14:36 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575116069.86', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52465', 'job_port': u'0'}
19/11/30 12:14:36 INFO statecache.__init__: Creating state cache with size 0
19/11/30 12:14:36 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39681.
19/11/30 12:14:36 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/30 12:14:36 INFO sdk_worker.__init__: Control channel established.
19/11/30 12:14:36 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/30 12:14:36 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38379.
19/11/30 12:14:36 INFO sdk_worker.create_state_handler: State channel established.
19/11/30 12:14:36 INFO data_plane.create_data_channel: Creating client data channel for localhost:36309
19/11/30 12:14:36 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/30 12:14:36 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/30 12:14:36 INFO sdk_worker.run: No more requests from control plane
19/11/30 12:14:36 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/30 12:14:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 12:14:36 INFO data_plane.close: Closing all cached grpc data channels.
19/11/30 12:14:36 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/30 12:14:36 INFO sdk_worker.run: Done consuming work.
19/11/30 12:14:36 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/30 12:14:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/30 12:14:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 12:14:36 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575116069.86_3cd1cbfa-7915-4c83-aa7b-2c6f436fef16 finished.
19/11/30 12:14:36 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/30 12:14:36 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_9221b2c7-6b45-4164-b4db-a41bc751aa71","basePath":"/tmp/sparktestBOKZU6"}: {}
java.io.FileNotFoundException: /tmp/sparktestBOKZU6/job_9221b2c7-6b45-4164-b4db-a41bc751aa71/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
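
A side note on the ERROR just before the test summary: it is a cleanup artifact rather than a test failure. This job staged no artifacts (note the repeated "GetManifest for __no_artifacts_staged__" lines), so the MANIFEST file that removeArtifacts tries to open was never written. Reduced to a hypothetical Python sketch (the real code is Java, BeamFileSystemArtifactStagingService.removeArtifacts), the guard whose absence produces the FileNotFoundException looks like:

    import os
    import shutil

    def remove_staged_artifacts(staging_dir):
        # Hypothetical illustration only, not Beam's actual implementation.
        manifest = os.path.join(staging_dir, 'MANIFEST')
        if not os.path.exists(manifest):
            # Nothing was staged for this job, so there is nothing to
            # clean up; returning avoids opening a manifest that never
            # existed, which is what raises FileNotFoundException above.
            return
        shutil.rmtree(staging_dir)
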
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
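
This BaseException comes from a per-test watchdog: the handler at portable_runner_test.py line 75 fires on a timeout, dumps the live threads, and then raises. That handler is also the source of the "==================== Timed out after 60 seconds. ====================" banners and "# Thread:" lines interleaved into the next traceback. A minimal sketch of the pattern, with hypothetical names and assuming a SIGALRM-based alarm rather than Beam's exact code:

    import signal
    import threading

    def install_watchdog(seconds=60):
        # Hypothetical sketch of the timeout pattern visible in the
        # traceback above; not the actual portable_runner_test.py code.
        def handler(signum, frame):
            msg = 'Timed out after %d seconds.' % seconds
            print('=' * 20 + ' ' + msg + ' ' + '=' * 20)
            # Dumping live threads is what produces the "# Thread:" lines.
            for thread in threading.enumerate():
                print('# Thread: %s' % thread)
            # BaseException (not Exception) so ordinary except clauses in
            # the code under test cannot swallow the timeout.
            raise BaseException(msg)
        signal.signal(signal.SIGALRM, handler)
        signal.alarm(seconds)

    # Usage: install_watchdog(60) at test start; signal.alarm(0) on success.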

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
==================== Timed out after 60 seconds. ====================

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
# Thread: <Thread(wait_until_finish_read, started daemon 139711570622208)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
# Thread: <Thread(Thread-117, started daemon 139711553836800)>

# Thread: <_MainThread(MainThread, started 139712357693184)>
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139711537051392)>

# Thread: <Thread(Thread-123, started daemon 139711545444096)>

# Thread: <_MainThread(MainThread, started 139712357693184)>

# Thread: <Thread(Thread-117, started daemon 139711553836800)>

# Thread: <Thread(wait_until_finish_read, started daemon 139711570622208)>
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575116057.53_71ca086a-5c44-40c8-a56f-f6dc227b021c failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
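
Unlike the two timeouts above, test_sdf_with_watermark_tracking fails outright: the Spark portable runner cannot yet accept checkpoints (splits) produced by a splittable DoFn (SDF), so the pipeline reaches FAILED instead of hanging. For orientation, a minimal splittable DoFn in the Python SDK has roughly this shape (a hypothetical example, not the test's actual code; the real test additionally tracks watermarks):

    import apache_beam as beam
    from apache_beam.io.restriction_trackers import (
        OffsetRange, OffsetRestrictionTracker)
    from apache_beam.transforms.core import RestrictionProvider

    class CharRestrictionProvider(RestrictionProvider):
        # One offset position per character of the input string.
        def initial_restriction(self, element):
            return OffsetRange(0, len(element))

        def create_tracker(self, restriction):
            return OffsetRestrictionTracker(restriction)

        def restriction_size(self, element, restriction):
            return restriction.size()

    class ExpandChars(beam.DoFn):
        def process(
            self,
            element,
            restriction_tracker=beam.DoFn.RestrictionParam(
                CharRestrictionProvider())):
            # The runner may split the restriction mid-bundle; the residual
            # half of such a split is what the Spark runner above has no
            # registered bundle checkpoint handler for.
            position = restriction_tracker.current_restriction().start
            while restriction_tracker.try_claim(position):
                yield element[position]
                position += 1

    # Usage sketch: p | beam.Create(['abc']) | beam.ParDo(ExpandChars())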

----------------------------------------------------------------------
Ran 38 tests in 390.704s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 53s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/tlkgrr3zr3c3o

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1657

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1657/display/redirect>

Changes:


------------------------------------------
[...truncated 1.31 MB...]
19/11/30 06:13:44 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1575094423.21_b7416250-51ef-463d-ab15-359481c8eb96 on Spark master local
19/11/30 06:13:44 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/11/30 06:13:44 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575094423.21_b7416250-51ef-463d-ab15-359481c8eb96: Pipeline translated successfully. Computing outputs
19/11/30 06:13:44 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/30 06:13:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/30 06:13:44 INFO sdk_worker_main.main: Logging handler created.
19/11/30 06:13:44 INFO sdk_worker_main.start: Status HTTP server running at localhost:44379
19/11/30 06:13:44 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/30 06:13:44 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/30 06:13:44 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575094423.21_b7416250-51ef-463d-ab15-359481c8eb96', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/30 06:13:44 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575094423.21', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:55161', 'job_port': u'0'}
19/11/30 06:13:44 INFO statecache.__init__: Creating state cache with size 0
19/11/30 06:13:44 INFO sdk_worker.__init__: Creating insecure control channel for localhost:32967.
19/11/30 06:13:44 INFO sdk_worker.__init__: Control channel established.
19/11/30 06:13:44 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/30 06:13:44 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/11/30 06:13:44 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35435.
19/11/30 06:13:44 INFO sdk_worker.create_state_handler: State channel established.
19/11/30 06:13:44 INFO data_plane.create_data_channel: Creating client data channel for localhost:36853
19/11/30 06:13:44 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/30 06:13:45 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/30 06:13:45 INFO sdk_worker.run: No more requests from control plane
19/11/30 06:13:45 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/30 06:13:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 06:13:45 INFO data_plane.close: Closing all cached grpc data channels.
19/11/30 06:13:45 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/30 06:13:45 INFO sdk_worker.run: Done consuming work.
19/11/30 06:13:45 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/30 06:13:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/30 06:13:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 06:13:45 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/30 06:13:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/30 06:13:45 INFO sdk_worker_main.main: Logging handler created.
19/11/30 06:13:45 INFO sdk_worker_main.start: Status HTTP server running at localhost:35519
19/11/30 06:13:45 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/30 06:13:45 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/30 06:13:45 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575094423.21_b7416250-51ef-463d-ab15-359481c8eb96', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/30 06:13:45 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575094423.21', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:55161', 'job_port': u'0'}
19/11/30 06:13:45 INFO statecache.__init__: Creating state cache with size 0
19/11/30 06:13:45 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35925.
19/11/30 06:13:45 INFO sdk_worker.__init__: Control channel established.
19/11/30 06:13:45 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/30 06:13:45 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/30 06:13:45 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43781.
19/11/30 06:13:45 INFO sdk_worker.create_state_handler: State channel established.
19/11/30 06:13:45 INFO data_plane.create_data_channel: Creating client data channel for localhost:43017
19/11/30 06:13:45 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/30 06:13:45 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/30 06:13:45 INFO sdk_worker.run: No more requests from control plane
19/11/30 06:13:45 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/30 06:13:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 06:13:45 INFO data_plane.close: Closing all cached grpc data channels.
19/11/30 06:13:45 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/30 06:13:45 INFO sdk_worker.run: Done consuming work.
19/11/30 06:13:45 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/30 06:13:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/30 06:13:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 06:13:45 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/30 06:13:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/30 06:13:46 INFO sdk_worker_main.main: Logging handler created.
19/11/30 06:13:46 INFO sdk_worker_main.start: Status HTTP server running at localhost:45115
19/11/30 06:13:46 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/30 06:13:46 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/30 06:13:46 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575094423.21_b7416250-51ef-463d-ab15-359481c8eb96', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/30 06:13:46 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575094423.21', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:55161', 'job_port': u'0'}
19/11/30 06:13:46 INFO statecache.__init__: Creating state cache with size 0
19/11/30 06:13:46 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46439.
19/11/30 06:13:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/30 06:13:46 INFO sdk_worker.__init__: Control channel established.
19/11/30 06:13:46 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/30 06:13:46 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33719.
19/11/30 06:13:46 INFO sdk_worker.create_state_handler: State channel established.
19/11/30 06:13:46 INFO data_plane.create_data_channel: Creating client data channel for localhost:35047
19/11/30 06:13:46 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/30 06:13:46 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/30 06:13:46 INFO sdk_worker.run: No more requests from control plane
19/11/30 06:13:46 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/30 06:13:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 06:13:46 INFO data_plane.close: Closing all cached grpc data channels.
19/11/30 06:13:46 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/30 06:13:46 INFO sdk_worker.run: Done consuming work.
19/11/30 06:13:46 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/30 06:13:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/30 06:13:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 06:13:46 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/30 06:13:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/30 06:13:47 INFO sdk_worker_main.main: Logging handler created.
19/11/30 06:13:47 INFO sdk_worker_main.start: Status HTTP server running at localhost:46371
19/11/30 06:13:47 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/30 06:13:47 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/30 06:13:47 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575094423.21_b7416250-51ef-463d-ab15-359481c8eb96', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/30 06:13:47 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575094423.21', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:55161', 'job_port': u'0'}
19/11/30 06:13:47 INFO statecache.__init__: Creating state cache with size 0
19/11/30 06:13:47 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43573.
19/11/30 06:13:47 INFO sdk_worker.__init__: Control channel established.
19/11/30 06:13:47 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/30 06:13:47 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/30 06:13:47 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35163.
19/11/30 06:13:47 INFO sdk_worker.create_state_handler: State channel established.
19/11/30 06:13:47 INFO data_plane.create_data_channel: Creating client data channel for localhost:40869
19/11/30 06:13:47 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/30 06:13:47 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/30 06:13:47 INFO sdk_worker.run: No more requests from control plane
19/11/30 06:13:47 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/30 06:13:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 06:13:47 INFO data_plane.close: Closing all cached grpc data channels.
19/11/30 06:13:47 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/30 06:13:47 INFO sdk_worker.run: Done consuming work.
19/11/30 06:13:47 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/30 06:13:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/30 06:13:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 06:13:47 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/30 06:13:48 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/30 06:13:48 INFO sdk_worker_main.main: Logging handler created.
19/11/30 06:13:48 INFO sdk_worker_main.start: Status HTTP server running at localhost:46235
19/11/30 06:13:48 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/30 06:13:48 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/30 06:13:48 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575094423.21_b7416250-51ef-463d-ab15-359481c8eb96', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/30 06:13:48 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575094423.21', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:55161', 'job_port': u'0'}
19/11/30 06:13:48 INFO statecache.__init__: Creating state cache with size 0
19/11/30 06:13:48 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40823.
19/11/30 06:13:48 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/30 06:13:48 INFO sdk_worker.__init__: Control channel established.
19/11/30 06:13:48 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/30 06:13:48 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33777.
19/11/30 06:13:48 INFO sdk_worker.create_state_handler: State channel established.
19/11/30 06:13:48 INFO data_plane.create_data_channel: Creating client data channel for localhost:46709
19/11/30 06:13:48 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/30 06:13:48 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/30 06:13:48 INFO sdk_worker.run: No more requests from control plane
19/11/30 06:13:48 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/30 06:13:48 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 06:13:48 INFO data_plane.close: Closing all cached grpc data channels.
19/11/30 06:13:48 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/30 06:13:48 INFO sdk_worker.run: Done consuming work.
19/11/30 06:13:48 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/30 06:13:48 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/30 06:13:48 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 06:13:48 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575094423.21_b7416250-51ef-463d-ab15-359481c8eb96 finished.
19/11/30 06:13:48 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/30 06:13:48 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_bb19bb58-dba2-4d67-a447-c69e8cc3bab0","basePath":"/tmp/sparktest5nkpoo"}: {}
java.io.FileNotFoundException: /tmp/sparktest5nkpoo/job_bb19bb58-dba2-4d67-a447-c69e8cc3bab0/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575094414.53_7de8c860-0f3b-4ceb-b8c1-efcb0017e56f failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139990183896832)>

# Thread: <Thread(Thread-119, started daemon 139990175504128)>

# Thread: <_MainThread(MainThread, started 139990979360512)>

----------------------------------------------------------------------
Ran 38 tests in 290.919s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 27s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/ldwkadgw26evy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1656

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1656/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/11/30 00:14:48 INFO sdk_worker_main.start: Status HTTP server running at localhost:36521
19/11/30 00:14:48 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/30 00:14:48 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/30 00:14:48 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575072885.57_58a7d6ba-405a-4636-94bf-9bb8da9f0a3e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/30 00:14:48 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575072885.57', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37285', 'job_port': u'0'}
19/11/30 00:14:48 INFO statecache.__init__: Creating state cache with size 0
19/11/30 00:14:48 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36389.
19/11/30 00:14:48 INFO sdk_worker.__init__: Control channel established.
19/11/30 00:14:48 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/30 00:14:48 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/30 00:14:48 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39833.
19/11/30 00:14:48 INFO sdk_worker.create_state_handler: State channel established.
19/11/30 00:14:48 INFO data_plane.create_data_channel: Creating client data channel for localhost:46531
19/11/30 00:14:48 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/30 00:14:48 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/30 00:14:48 INFO sdk_worker.run: No more requests from control plane
19/11/30 00:14:48 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/30 00:14:48 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 00:14:48 INFO data_plane.close: Closing all cached grpc data channels.
19/11/30 00:14:48 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/30 00:14:48 INFO sdk_worker.run: Done consuming work.
19/11/30 00:14:48 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/30 00:14:48 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/30 00:14:48 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 00:14:48 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/30 00:14:49 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/30 00:14:49 INFO sdk_worker_main.main: Logging handler created.
19/11/30 00:14:49 INFO sdk_worker_main.start: Status HTTP server running at localhost:44247
19/11/30 00:14:49 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/30 00:14:49 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/30 00:14:49 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575072885.57_58a7d6ba-405a-4636-94bf-9bb8da9f0a3e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/30 00:14:49 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575072885.57', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37285', 'job_port': u'0'}
19/11/30 00:14:49 INFO statecache.__init__: Creating state cache with size 0
19/11/30 00:14:49 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35123.
19/11/30 00:14:49 INFO sdk_worker.__init__: Control channel established.
19/11/30 00:14:49 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/30 00:14:49 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/30 00:14:49 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44495.
19/11/30 00:14:49 INFO sdk_worker.create_state_handler: State channel established.
19/11/30 00:14:49 INFO data_plane.create_data_channel: Creating client data channel for localhost:39499
19/11/30 00:14:49 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/30 00:14:49 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/30 00:14:49 INFO sdk_worker.run: No more requests from control plane
19/11/30 00:14:49 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/30 00:14:49 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 00:14:49 INFO data_plane.close: Closing all cached grpc data channels.
19/11/30 00:14:49 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/30 00:14:49 INFO sdk_worker.run: Done consuming work.
19/11/30 00:14:49 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/30 00:14:49 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/30 00:14:50 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 00:14:50 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/30 00:14:50 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/30 00:14:50 INFO sdk_worker_main.main: Logging handler created.
19/11/30 00:14:50 INFO sdk_worker_main.start: Status HTTP server running at localhost:39669
19/11/30 00:14:50 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/30 00:14:50 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/30 00:14:50 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575072885.57_58a7d6ba-405a-4636-94bf-9bb8da9f0a3e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/30 00:14:50 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575072885.57', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37285', 'job_port': u'0'}
19/11/30 00:14:50 INFO statecache.__init__: Creating state cache with size 0
19/11/30 00:14:50 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40087.
19/11/30 00:14:50 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/30 00:14:50 INFO sdk_worker.__init__: Control channel established.
19/11/30 00:14:50 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/30 00:14:50 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40339.
19/11/30 00:14:50 INFO sdk_worker.create_state_handler: State channel established.
19/11/30 00:14:50 INFO data_plane.create_data_channel: Creating client data channel for localhost:42267
19/11/30 00:14:50 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/30 00:14:50 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/30 00:14:50 INFO sdk_worker.run: No more requests from control plane
19/11/30 00:14:50 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/30 00:14:50 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 00:14:50 INFO data_plane.close: Closing all cached grpc data channels.
19/11/30 00:14:50 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/30 00:14:50 INFO sdk_worker.run: Done consuming work.
19/11/30 00:14:50 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/30 00:14:50 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/30 00:14:50 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 00:14:50 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/30 00:14:51 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/30 00:14:51 INFO sdk_worker_main.main: Logging handler created.
19/11/30 00:14:51 INFO sdk_worker_main.start: Status HTTP server running at localhost:34937
19/11/30 00:14:51 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/30 00:14:51 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/30 00:14:51 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575072885.57_58a7d6ba-405a-4636-94bf-9bb8da9f0a3e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/30 00:14:51 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575072885.57', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37285', 'job_port': u'0'}
19/11/30 00:14:51 INFO statecache.__init__: Creating state cache with size 0
19/11/30 00:14:51 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44903.
19/11/30 00:14:51 INFO sdk_worker.__init__: Control channel established.
19/11/30 00:14:51 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/30 00:14:51 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/30 00:14:51 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44183.
19/11/30 00:14:51 INFO sdk_worker.create_state_handler: State channel established.
19/11/30 00:14:51 INFO data_plane.create_data_channel: Creating client data channel for localhost:36255
19/11/30 00:14:51 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/30 00:14:51 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/30 00:14:51 INFO sdk_worker.run: No more requests from control plane
19/11/30 00:14:51 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/30 00:14:51 INFO data_plane.close: Closing all cached grpc data channels.
19/11/30 00:14:51 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 00:14:51 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/30 00:14:51 INFO sdk_worker.run: Done consuming work.
19/11/30 00:14:51 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/30 00:14:51 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/30 00:14:51 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
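
Each worker lifecycle logged above amounts to the SDK harness opening plain gRPC channels back to the runner: control, state, and data, plus logging. "Insecure" in those messages simply means an unencrypted channel; a sketch (the port is hypothetical, each worker logs the real one it was handed):

    import grpc

    channel = grpc.insecure_channel('localhost:44903')
    # Fail fast instead of hanging if the runner side has already gone away.
    grpc.channel_ready_future(channel).result(timeout=10)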
19/11/30 00:14:51 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575072885.57_58a7d6ba-405a-4636-94bf-9bb8da9f0a3e finished.
19/11/30 00:14:51 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/30 00:14:51 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_48db0018-07ef-4b59-982a-8f3db09921c6","basePath":"/tmp/sparktestSxlOWN"}: {}
java.io.FileNotFoundException: /tmp/sparktestSxlOWN/job_48db0018-07ef-4b59-982a-8f3db09921c6/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
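
This cleanup failure looks benign: GetManifest above reported __no_artifacts_staged__, so there is no MANIFEST left to delete by the time removeArtifacts runs, and the job still reaches DONE. A tolerant-cleanup sketch of the pattern (illustrative Python, not the runner's actual Java code):

    import errno
    import os

    def remove_manifest_if_present(path):
        """Best-effort cleanup: a missing MANIFEST means nothing was staged."""
        try:
            os.remove(path)
        except OSError as e:
            if e.errno != errno.ENOENT:  # re-raise everything but "not found"
                raise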
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
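
The exception above is raised by a watchdog the test suite installs (the handler at portable_runner_test.py line 75 in the traceback); the "# Thread:" lines interleaved with the later failures are its dump of every live thread at the moment of timeout. A sketch of that kind of watchdog, assuming a Unix SIGALRM-based timer:

    import signal
    import sys
    import threading
    import traceback

    TIMEOUT_SECS = 60

    def handler(signum, frame):
        msg = 'Timed out after %d seconds.' % TIMEOUT_SECS
        print('==================== %s ====================' % msg)
        frames = sys._current_frames()
        for t in threading.enumerate():
            print('# Thread: %s' % t)  # same format as the dumps in this log
            if t.ident in frames:
                traceback.print_stack(frames[t.ident])
        raise BaseException(msg)

    signal.signal(signal.SIGALRM, handler)
    signal.alarm(TIMEOUT_SECS)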

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139853024847616)>

# Thread: <Thread(Thread-118, started daemon 139853008062208)>

# Thread: <_MainThread(MainThread, started 139853804586752)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575072874.84_228de8fb-7cdf-4d8d-a149-6e844e40a68d failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139852918023936)>

# Thread: <Thread(Thread-123, started daemon 139852926416640)>

# Thread: <Thread(Thread-118, started daemon 139853008062208)>

# Thread: <Thread(wait_until_finish_read, started daemon 139853024847616)>

# Thread: <_MainThread(MainThread, started 139853804586752)>

----------------------------------------------------------------------
Ran 38 tests in 346.700s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 10m 3s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/ljuieyrac5ssg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1655

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1655/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/11/29 18:14:22 INFO sdk_worker_main.start: Status HTTP server running at localhost:38003
19/11/29 18:14:22 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 18:14:22 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 18:14:22 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575051259.83_a6b2b250-ccf9-42f3-b596-f47b9aaddda0', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 18:14:22 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575051259.83', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:55047', 'job_port': u'0'}
19/11/29 18:14:22 INFO statecache.__init__: Creating state cache with size 0
19/11/29 18:14:22 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43597.
19/11/29 18:14:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/29 18:14:22 INFO sdk_worker.__init__: Control channel established.
19/11/29 18:14:22 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 18:14:22 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45301.
19/11/29 18:14:22 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 18:14:22 INFO data_plane.create_data_channel: Creating client data channel for localhost:33895
19/11/29 18:14:22 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 18:14:22 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/29 18:14:22 INFO sdk_worker.run: No more requests from control plane
19/11/29 18:14:22 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 18:14:22 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 18:14:22 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 18:14:22 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 18:14:22 INFO sdk_worker.run: Done consuming work.
19/11/29 18:14:22 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 18:14:22 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 18:14:22 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 18:14:22 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 18:14:23 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 18:14:23 INFO sdk_worker_main.main: Logging handler created.
19/11/29 18:14:23 INFO sdk_worker_main.start: Status HTTP server running at localhost:44011
19/11/29 18:14:23 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 18:14:23 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 18:14:23 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575051259.83_a6b2b250-ccf9-42f3-b596-f47b9aaddda0', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 18:14:23 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575051259.83', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:55047', 'job_port': u'0'}
19/11/29 18:14:23 INFO statecache.__init__: Creating state cache with size 0
19/11/29 18:14:23 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45871.
19/11/29 18:14:23 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/29 18:14:23 INFO sdk_worker.__init__: Control channel established.
19/11/29 18:14:23 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 18:14:23 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37195.
19/11/29 18:14:23 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 18:14:23 INFO data_plane.create_data_channel: Creating client data channel for localhost:41657
19/11/29 18:14:23 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 18:14:23 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/29 18:14:23 INFO sdk_worker.run: No more requests from control plane
19/11/29 18:14:23 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 18:14:23 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 18:14:23 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 18:14:23 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 18:14:23 INFO sdk_worker.run: Done consuming work.
19/11/29 18:14:23 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 18:14:23 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 18:14:23 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 18:14:23 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 18:14:24 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 18:14:24 INFO sdk_worker_main.main: Logging handler created.
19/11/29 18:14:24 INFO sdk_worker_main.start: Status HTTP server running at localhost:38091
19/11/29 18:14:24 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 18:14:24 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 18:14:24 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575051259.83_a6b2b250-ccf9-42f3-b596-f47b9aaddda0', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 18:14:24 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575051259.83', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:55047', 'job_port': u'0'}
19/11/29 18:14:24 INFO statecache.__init__: Creating state cache with size 0
19/11/29 18:14:24 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46499.
19/11/29 18:14:24 INFO sdk_worker.__init__: Control channel established.
19/11/29 18:14:24 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/29 18:14:24 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 18:14:24 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44909.
19/11/29 18:14:24 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 18:14:24 INFO data_plane.create_data_channel: Creating client data channel for localhost:35809
19/11/29 18:14:24 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 18:14:24 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/29 18:14:24 INFO sdk_worker.run: No more requests from control plane
19/11/29 18:14:24 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 18:14:24 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 18:14:24 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 18:14:24 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 18:14:24 INFO sdk_worker.run: Done consuming work.
19/11/29 18:14:24 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 18:14:24 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 18:14:24 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 18:14:24 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 18:14:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 18:14:25 INFO sdk_worker_main.main: Logging handler created.
19/11/29 18:14:25 INFO sdk_worker_main.start: Status HTTP server running at localhost:35301
19/11/29 18:14:25 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 18:14:25 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 18:14:25 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575051259.83_a6b2b250-ccf9-42f3-b596-f47b9aaddda0', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 18:14:25 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575051259.83', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:55047', 'job_port': u'0'}
19/11/29 18:14:25 INFO statecache.__init__: Creating state cache with size 0
19/11/29 18:14:25 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43685.
19/11/29 18:14:25 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/29 18:14:25 INFO sdk_worker.__init__: Control channel established.
19/11/29 18:14:25 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 18:14:25 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43791.
19/11/29 18:14:25 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 18:14:25 INFO data_plane.create_data_channel: Creating client data channel for localhost:40761
19/11/29 18:14:25 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 18:14:25 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/29 18:14:25 INFO sdk_worker.run: No more requests from control plane
19/11/29 18:14:25 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 18:14:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 18:14:25 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 18:14:25 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 18:14:25 INFO sdk_worker.run: Done consuming work.
19/11/29 18:14:25 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 18:14:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 18:14:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 18:14:25 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575051259.83_a6b2b250-ccf9-42f3-b596-f47b9aaddda0 finished.
19/11/29 18:14:25 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/29 18:14:25 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_b1b2af48-4685-432e-8479-5d19945bde69","basePath":"/tmp/sparktest15mcnz"}: {}
java.io.FileNotFoundException: /tmp/sparktest15mcnz/job_b1b2af48-4685-432e-8479-5d19945bde69/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140310801803008)>

# Thread: <Thread(Thread-117, started daemon 140310785017600)>

# Thread: <_MainThread(MainThread, started 140311654516480)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140310768232192)>

# Thread: <Thread(Thread-123, started daemon 140310776624896)>

# Thread: <Thread(Thread-117, started daemon 140310785017600)>

# Thread: <Thread(wait_until_finish_read, started daemon 140310801803008)>

# Thread: <_MainThread(MainThread, started 140311654516480)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575051249.94_4caff44e-cd3d-4568-991e-29abb3e23b24 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
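
Unlike the two timeouts, this failure is deterministic: the watermark-tracking test exercises splittable DoFn checkpointing, and the portable Spark runner raises UnsupportedOperationException because no bundle checkpoint handler is registered. One conventional way to quarantine such a test until the runner supports the feature is to override it in the runner-specific suite and skip (class and module names are taken from the tracebacks; the override itself is a sketch, not the project's actual fix):

    import unittest

    from apache_beam.runners.portability import portable_runner_test

    class SparkRunnerTest(portable_runner_test.PortableRunnerTest):

        def test_sdf_with_watermark_tracking(self):
            # Needs runner-side bundle checkpointing, which the portable
            # Spark runner does not register yet.
            raise unittest.SkipTest(
                'No bundle checkpoint handler in the Spark runner.')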

----------------------------------------------------------------------
Ran 38 tests in 310.058s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 22s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/dndm7ra5cuakq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1654

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1654/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/11/29 12:12:37 INFO sdk_worker_main.start: Status HTTP server running at localhost:42615
19/11/29 12:12:37 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 12:12:37 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 12:12:37 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575029554.43_b56895fb-7feb-4107-8f0b-1aa0b3b6c877', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 12:12:37 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575029554.43', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48583', 'job_port': u'0'}
19/11/29 12:12:37 INFO statecache.__init__: Creating state cache with size 0
19/11/29 12:12:37 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45481.
19/11/29 12:12:37 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/29 12:12:37 INFO sdk_worker.__init__: Control channel established.
19/11/29 12:12:37 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 12:12:37 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33027.
19/11/29 12:12:37 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 12:12:37 INFO data_plane.create_data_channel: Creating client data channel for localhost:32901
19/11/29 12:12:37 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 12:12:37 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/29 12:12:37 INFO sdk_worker.run: No more requests from control plane
19/11/29 12:12:37 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 12:12:37 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 12:12:37 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 12:12:37 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 12:12:37 INFO sdk_worker.run: Done consuming work.
19/11/29 12:12:37 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 12:12:37 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 12:12:37 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 12:12:37 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 12:12:38 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 12:12:38 INFO sdk_worker_main.main: Logging handler created.
19/11/29 12:12:38 INFO sdk_worker_main.start: Status HTTP server running at localhost:37781
19/11/29 12:12:38 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 12:12:38 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 12:12:38 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575029554.43_b56895fb-7feb-4107-8f0b-1aa0b3b6c877', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 12:12:38 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575029554.43', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48583', 'job_port': u'0'}
19/11/29 12:12:38 INFO statecache.__init__: Creating state cache with size 0
19/11/29 12:12:38 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42357.
19/11/29 12:12:38 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/29 12:12:38 INFO sdk_worker.__init__: Control channel established.
19/11/29 12:12:38 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 12:12:38 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43987.
19/11/29 12:12:38 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 12:12:38 INFO data_plane.create_data_channel: Creating client data channel for localhost:39295
19/11/29 12:12:38 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 12:12:38 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/29 12:12:38 INFO sdk_worker.run: No more requests from control plane
19/11/29 12:12:38 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 12:12:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 12:12:38 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 12:12:38 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 12:12:38 INFO sdk_worker.run: Done consuming work.
19/11/29 12:12:38 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 12:12:38 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 12:12:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 12:12:38 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 12:12:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 12:12:39 INFO sdk_worker_main.main: Logging handler created.
19/11/29 12:12:39 INFO sdk_worker_main.start: Status HTTP server running at localhost:35499
19/11/29 12:12:39 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 12:12:39 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 12:12:39 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575029554.43_b56895fb-7feb-4107-8f0b-1aa0b3b6c877', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 12:12:39 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575029554.43', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48583', 'job_port': u'0'}
19/11/29 12:12:39 INFO statecache.__init__: Creating state cache with size 0
19/11/29 12:12:39 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36363.
19/11/29 12:12:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/29 12:12:39 INFO sdk_worker.__init__: Control channel established.
19/11/29 12:12:39 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 12:12:39 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33791.
19/11/29 12:12:39 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 12:12:39 INFO data_plane.create_data_channel: Creating client data channel for localhost:43577
19/11/29 12:12:39 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 12:12:39 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/29 12:12:39 INFO sdk_worker.run: No more requests from control plane
19/11/29 12:12:39 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 12:12:39 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 12:12:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 12:12:39 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 12:12:39 INFO sdk_worker.run: Done consuming work.
19/11/29 12:12:39 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 12:12:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 12:12:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 12:12:39 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 12:12:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 12:12:40 INFO sdk_worker_main.main: Logging handler created.
19/11/29 12:12:40 INFO sdk_worker_main.start: Status HTTP server running at localhost:43873
19/11/29 12:12:40 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 12:12:40 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 12:12:40 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575029554.43_b56895fb-7feb-4107-8f0b-1aa0b3b6c877', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 12:12:40 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575029554.43', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48583', 'job_port': u'0'}
19/11/29 12:12:40 INFO statecache.__init__: Creating state cache with size 0
19/11/29 12:12:40 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35203.
19/11/29 12:12:40 INFO sdk_worker.__init__: Control channel established.
19/11/29 12:12:40 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/29 12:12:40 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 12:12:40 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41033.
19/11/29 12:12:40 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 12:12:40 INFO data_plane.create_data_channel: Creating client data channel for localhost:38461
19/11/29 12:12:40 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 12:12:40 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/29 12:12:40 INFO sdk_worker.run: No more requests from control plane
19/11/29 12:12:40 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 12:12:40 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 12:12:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 12:12:40 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 12:12:40 INFO sdk_worker.run: Done consuming work.
19/11/29 12:12:40 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 12:12:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 12:12:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 12:12:40 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575029554.43_b56895fb-7feb-4107-8f0b-1aa0b3b6c877 finished.
19/11/29 12:12:40 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/29 12:12:40 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_d8cf24be-967d-4d0a-98a7-551048e51290","basePath":"/tmp/sparktesteS8EBL"}: {}
java.io.FileNotFoundException: /tmp/sparktesteS8EBL/job_d8cf24be-967d-4d0a-98a7-551048e51290/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139668226037504)>

# Thread: <Thread(Thread-117, started daemon 139668242822912)>

# Thread: <_MainThread(MainThread, started 139669361047296)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139668209252096)>

# Thread: <Thread(Thread-123, started daemon 139668217644800)>

# Thread: <Thread(Thread-117, started daemon 139668242822912)>

# Thread: <Thread(wait_until_finish_read, started daemon 139668226037504)>

# Thread: <_MainThread(MainThread, started 139669361047296)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575029543.72_f2ed65e8-5f92-4c62-b1f1-05bf65105e97 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 355.012s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 7s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/vrzmgbbhc4q5k

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1653

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1653/display/redirect?page=changes>

Changes:

[michal.walenia] [BEAM-4776] Add metrics support to Java PortableRunner


------------------------------------------
[...truncated 1.32 MB...]
19/11/29 10:50:08 INFO sdk_worker_main.start: Status HTTP server running at localhost:42463
19/11/29 10:50:08 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 10:50:08 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 10:50:08 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575024605.76_33164bde-7f4b-4610-ad8a-a3da2692fca6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
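
The "Discarding unparseable args" WARN above is expected here: the SDK
harness re-parses the job's full option list, keeps the flags it knows, and
drops runner-side flags such as --spark_master with exactly this warning.
A small illustration (flag values are hypothetical; get_all_options is the
method named in the log):

    from apache_beam.options.pipeline_options import PipelineOptions

    # --job_name is a known pipeline option; --spark_master is unknown to
    # the SDK harness, so it is reported and discarded.
    opts = PipelineOptions(['--job_name=test_windowing', '--spark_master=local'])
    print(opts.get_all_options())
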
19/11/29 10:50:08 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575024605.76', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56299', 'job_port': u'0'}
19/11/29 10:50:08 INFO statecache.__init__: Creating state cache with size 0
19/11/29 10:50:08 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43531.
19/11/29 10:50:08 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/29 10:50:08 INFO sdk_worker.__init__: Control channel established.
19/11/29 10:50:08 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 10:50:08 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37503.
19/11/29 10:50:08 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 10:50:08 INFO data_plane.create_data_channel: Creating client data channel for localhost:35437
19/11/29 10:50:08 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 10:50:08 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/29 10:50:08 INFO sdk_worker.run: No more requests from control plane
19/11/29 10:50:08 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 10:50:08 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 10:50:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 10:50:08 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 10:50:08 INFO sdk_worker.run: Done consuming work.
19/11/29 10:50:08 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 10:50:08 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 10:50:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 10:50:08 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 10:50:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 10:50:09 INFO sdk_worker_main.main: Logging handler created.
19/11/29 10:50:09 INFO sdk_worker_main.start: Status HTTP server running at localhost:40289
19/11/29 10:50:09 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 10:50:09 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 10:50:09 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575024605.76_33164bde-7f4b-4610-ad8a-a3da2692fca6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 10:50:09 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575024605.76', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56299', 'job_port': u'0'}
19/11/29 10:50:09 INFO statecache.__init__: Creating state cache with size 0
19/11/29 10:50:09 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36399.
19/11/29 10:50:09 INFO sdk_worker.__init__: Control channel established.
19/11/29 10:50:09 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 10:50:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/29 10:50:09 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36603.
19/11/29 10:50:09 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 10:50:09 INFO data_plane.create_data_channel: Creating client data channel for localhost:35241
19/11/29 10:50:09 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 10:50:09 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/29 10:50:09 INFO sdk_worker.run: No more requests from control plane
19/11/29 10:50:09 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 10:50:09 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 10:50:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 10:50:09 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 10:50:09 INFO sdk_worker.run: Done consuming work.
19/11/29 10:50:09 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 10:50:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 10:50:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 10:50:09 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 10:50:10 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 10:50:10 INFO sdk_worker_main.main: Logging handler created.
19/11/29 10:50:10 INFO sdk_worker_main.start: Status HTTP server running at localhost:34789
19/11/29 10:50:10 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 10:50:10 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 10:50:10 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575024605.76_33164bde-7f4b-4610-ad8a-a3da2692fca6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 10:50:10 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575024605.76', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56299', 'job_port': u'0'}
19/11/29 10:50:10 INFO statecache.__init__: Creating state cache with size 0
19/11/29 10:50:10 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45001.
19/11/29 10:50:10 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/29 10:50:10 INFO sdk_worker.__init__: Control channel established.
19/11/29 10:50:10 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 10:50:10 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34823.
19/11/29 10:50:10 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 10:50:10 INFO data_plane.create_data_channel: Creating client data channel for localhost:40903
19/11/29 10:50:10 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 10:50:10 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/29 10:50:10 INFO sdk_worker.run: No more requests from control plane
19/11/29 10:50:10 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 10:50:10 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 10:50:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 10:50:10 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 10:50:10 INFO sdk_worker.run: Done consuming work.
19/11/29 10:50:10 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 10:50:10 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 10:50:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 10:50:10 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 10:50:11 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 10:50:11 INFO sdk_worker_main.main: Logging handler created.
19/11/29 10:50:11 INFO sdk_worker_main.start: Status HTTP server running at localhost:33847
19/11/29 10:50:11 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 10:50:11 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 10:50:11 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575024605.76_33164bde-7f4b-4610-ad8a-a3da2692fca6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 10:50:11 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575024605.76', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56299', 'job_port': u'0'}
19/11/29 10:50:11 INFO statecache.__init__: Creating state cache with size 0
19/11/29 10:50:11 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44901.
19/11/29 10:50:11 INFO sdk_worker.__init__: Control channel established.
19/11/29 10:50:11 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/29 10:50:11 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 10:50:11 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36833.
19/11/29 10:50:11 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 10:50:11 INFO data_plane.create_data_channel: Creating client data channel for localhost:42143
19/11/29 10:50:11 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 10:50:11 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/29 10:50:11 INFO sdk_worker.run: No more requests from control plane
19/11/29 10:50:11 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 10:50:11 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 10:50:11 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 10:50:11 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 10:50:11 INFO sdk_worker.run: Done consuming work.
19/11/29 10:50:11 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 10:50:11 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 10:50:11 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 10:50:11 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575024605.76_33164bde-7f4b-4610-ad8a-a3da2692fca6 finished.
19/11/29 10:50:11 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/29 10:50:11 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_d6ae00cf-7dd2-4474-85b7-0d60cdcf768a","basePath":"/tmp/sparktestZcj4xW"}: {}
java.io.FileNotFoundException: /tmp/sparktestZcj4xW/job_d6ae00cf-7dd2-4474-85b7-0d60cdcf768a/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
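
This cleanup ERROR is noisy but harmless: nothing was staged for the job
(note the repeated "GetManifest for __no_artifacts_staged__" lines), yet the
cleanup path still opens <basePath>/<sessionId>/MANIFEST. The real code is
the Java BeamFileSystemArtifactStagingService shown in the trace; a hedged
Python sketch of the tolerant-cleanup idea (names are illustrative):

    import json
    import os

    def remove_artifacts(staging_token):
        # Token shape copied from the ERROR line above:
        # {"sessionId": "job_...", "basePath": "/tmp/sparktest..."}
        token = json.loads(staging_token)
        manifest = os.path.join(token['basePath'], token['sessionId'], 'MANIFEST')
        if not os.path.exists(manifest):
            return  # nothing was staged, so there is nothing to remove
        # ... otherwise read the manifest and delete the listed artifacts ...
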
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
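
For reference, test_pardo_state_with_custom_key_coder exercises per-key
state, where every state request that crosses the state channel is keyed by
the coder under test. A sketch of a stateful DoFn of that general shape
(assumed, not the actual test body):

    import apache_beam as beam
    from apache_beam.coders import VarIntCoder
    from apache_beam.transforms.userstate import CombiningValueStateSpec

    class Index(beam.DoFn):
        # Per-key running count; each add/read below becomes a state
        # request routed through the runner's state channel.
        INDEX = CombiningValueStateSpec('index', VarIntCoder(), sum)

        def process(self, kv, index=beam.DoFn.StateParam(INDEX)):
            key, value = kv
            index.add(1)
            yield key, value, index.read()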

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140121965852416)>
# Thread: <Thread(Thread-116, started daemon 140122450085632)>
# Thread: <_MainThread(MainThread, started 140123236865792)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140121949067008)>
# Thread: <Thread(Thread-122, started daemon 140121957459712)>
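
test_pardo_timers, whose traceback the watchdog output above interrupts,
drives the portable timer path. A sketch of the feature it covers, an
event-time timer DoFn (assumed shape; names and the 10-second delay are
made up):

    import apache_beam as beam
    from apache_beam.transforms.timeutil import TimeDomain
    from apache_beam.transforms.userstate import TimerSpec, on_timer

    class FireOnExpiry(beam.DoFn):
        EXPIRY = TimerSpec('expiry', TimeDomain.WATERMARK)

        def process(self, kv, timer=beam.DoFn.TimerParam(EXPIRY)):
            key, ts = kv
            # Ask the runner to fire once the watermark passes ts + 10.
            timer.set(ts + 10)

        @on_timer(EXPIRY)
        def expired(self):
            yield 'fired'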

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575024595.75_fc4af90c-6c4b-432a-b0bc-121618f770fb failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 374.762s

FAILED (errors=3, skipped=9)

# Thread: <Thread(Thread-116, started daemon 140122450085632)>
# Thread: <_MainThread(MainThread, started 140123236865792)>
# Thread: <Thread(wait_until_finish_read, started daemon 140121965852416)>

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 34s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/hthqaapepoflo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1652

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1652/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/11/29 06:11:25 INFO sdk_worker_main.start: Status HTTP server running at localhost:42821
19/11/29 06:11:25 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 06:11:25 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 06:11:25 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575007882.93_4ede7c62-9a82-402b-8731-afff60c28df3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 06:11:25 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575007882.93', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41345', 'job_port': u'0'}
19/11/29 06:11:25 INFO statecache.__init__: Creating state cache with size 0
19/11/29 06:11:25 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38501.
19/11/29 06:11:25 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/29 06:11:25 INFO sdk_worker.__init__: Control channel established.
19/11/29 06:11:25 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 06:11:25 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46805.
19/11/29 06:11:25 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 06:11:25 INFO data_plane.create_data_channel: Creating client data channel for localhost:37533
19/11/29 06:11:25 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 06:11:25 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/29 06:11:25 INFO sdk_worker.run: No more requests from control plane
19/11/29 06:11:25 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 06:11:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 06:11:25 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 06:11:25 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 06:11:25 INFO sdk_worker.run: Done consuming work.
19/11/29 06:11:25 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 06:11:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 06:11:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 06:11:25 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 06:11:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 06:11:26 INFO sdk_worker_main.main: Logging handler created.
19/11/29 06:11:26 INFO sdk_worker_main.start: Status HTTP server running at localhost:39925
19/11/29 06:11:26 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 06:11:26 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 06:11:26 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575007882.93_4ede7c62-9a82-402b-8731-afff60c28df3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 06:11:26 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575007882.93', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41345', 'job_port': u'0'}
19/11/29 06:11:26 INFO statecache.__init__: Creating state cache with size 0
19/11/29 06:11:26 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37647.
19/11/29 06:11:26 INFO sdk_worker.__init__: Control channel established.
19/11/29 06:11:26 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 06:11:26 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/29 06:11:26 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34887.
19/11/29 06:11:26 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 06:11:26 INFO data_plane.create_data_channel: Creating client data channel for localhost:38961
19/11/29 06:11:26 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 06:11:26 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/29 06:11:26 INFO sdk_worker.run: No more requests from control plane
19/11/29 06:11:26 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 06:11:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 06:11:26 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 06:11:26 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 06:11:26 INFO sdk_worker.run: Done consuming work.
19/11/29 06:11:26 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 06:11:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 06:11:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 06:11:26 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 06:11:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 06:11:26 INFO sdk_worker_main.main: Logging handler created.
19/11/29 06:11:26 INFO sdk_worker_main.start: Status HTTP server running at localhost:44403
19/11/29 06:11:26 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 06:11:26 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 06:11:26 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575007882.93_4ede7c62-9a82-402b-8731-afff60c28df3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 06:11:26 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575007882.93', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41345', 'job_port': u'0'}
19/11/29 06:11:26 INFO statecache.__init__: Creating state cache with size 0
19/11/29 06:11:26 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38545.
19/11/29 06:11:26 INFO sdk_worker.__init__: Control channel established.
19/11/29 06:11:26 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 06:11:26 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/29 06:11:26 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39013.
19/11/29 06:11:26 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 06:11:26 INFO data_plane.create_data_channel: Creating client data channel for localhost:44051
19/11/29 06:11:26 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 06:11:27 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/29 06:11:27 INFO sdk_worker.run: No more requests from control plane
19/11/29 06:11:27 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 06:11:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 06:11:27 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 06:11:27 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 06:11:27 INFO sdk_worker.run: Done consuming work.
19/11/29 06:11:27 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 06:11:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 06:11:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 06:11:27 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 06:11:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 06:11:27 INFO sdk_worker_main.main: Logging handler created.
19/11/29 06:11:27 INFO sdk_worker_main.start: Status HTTP server running at localhost:40567
19/11/29 06:11:27 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 06:11:27 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 06:11:27 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575007882.93_4ede7c62-9a82-402b-8731-afff60c28df3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 06:11:27 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575007882.93', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41345', 'job_port': u'0'}
19/11/29 06:11:27 INFO statecache.__init__: Creating state cache with size 0
19/11/29 06:11:27 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39163.
19/11/29 06:11:27 INFO sdk_worker.__init__: Control channel established.
19/11/29 06:11:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/29 06:11:27 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 06:11:27 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44101.
19/11/29 06:11:27 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 06:11:27 INFO data_plane.create_data_channel: Creating client data channel for localhost:44047
19/11/29 06:11:27 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 06:11:27 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/29 06:11:27 INFO sdk_worker.run: No more requests from control plane
19/11/29 06:11:27 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 06:11:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 06:11:27 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 06:11:27 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 06:11:27 INFO sdk_worker.run: Done consuming work.
19/11/29 06:11:27 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 06:11:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 06:11:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 06:11:27 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575007882.93_4ede7c62-9a82-402b-8731-afff60c28df3 finished.
19/11/29 06:11:27 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/29 06:11:27 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_bb078cef-db22-483a-8ea3-85f0ce69cb25","basePath":"/tmp/sparktestAIrvRJ"}: {}
java.io.FileNotFoundException: /tmp/sparktestAIrvRJ/job_bb078cef-db22-483a-8ea3-85f0ce69cb25/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140636883773184)>
# Thread: <Thread(Thread-118, started daemon 140637236946688)>
# Thread: <_MainThread(MainThread, started 140638016186112)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140636875380480)>
# Thread: <Thread(Thread-124, started daemon 140636866987776)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575007874.29_a26ed542-8f8e-4a1b-b94b-47869cdd7acd failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

# Thread: <Thread(Thread-118, started daemon 140637236946688)>
# Thread: <Thread(wait_until_finish_read, started daemon 140636883773184)>
# Thread: <_MainThread(MainThread, started 140638016186112)>

----------------------------------------------------------------------
Ran 38 tests in 292.068s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 56s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/ksnchvfk64xhw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1651

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1651/display/redirect?page=changes>

Changes:

[milantracy] [BEAM-8406] Add support for JSON format text tables


------------------------------------------
[...truncated 1.31 MB...]
19/11/29 01:37:35 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1574991454.52_dc10f49f-1fb5-425d-b431-1b4587bd283c on Spark master local
19/11/29 01:37:35 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/11/29 01:37:35 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574991454.52_dc10f49f-1fb5-425d-b431-1b4587bd283c: Pipeline translated successfully. Computing outputs
19/11/29 01:37:35 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 01:37:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 01:37:36 INFO sdk_worker_main.main: Logging handler created.
19/11/29 01:37:36 INFO sdk_worker_main.start: Status HTTP server running at localhost:39697
19/11/29 01:37:36 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 01:37:36 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 01:37:36 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574991454.52_dc10f49f-1fb5-425d-b431-1b4587bd283c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 01:37:36 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574991454.52', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:34069', 'job_port': u'0'}
19/11/29 01:37:36 INFO statecache.__init__: Creating state cache with size 0
19/11/29 01:37:36 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46491.
19/11/29 01:37:36 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/11/29 01:37:36 INFO sdk_worker.__init__: Control channel established.
19/11/29 01:37:36 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 01:37:36 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41617.
19/11/29 01:37:36 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 01:37:36 INFO data_plane.create_data_channel: Creating client data channel for localhost:39375
19/11/29 01:37:36 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 01:37:36 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/29 01:37:36 INFO sdk_worker.run: No more requests from control plane
19/11/29 01:37:36 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 01:37:36 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 01:37:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 01:37:36 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 01:37:36 INFO sdk_worker.run: Done consuming work.
19/11/29 01:37:36 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 01:37:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 01:37:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 01:37:36 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 01:37:37 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 01:37:37 INFO sdk_worker_main.main: Logging handler created.
19/11/29 01:37:37 INFO sdk_worker_main.start: Status HTTP server running at localhost:37475
19/11/29 01:37:37 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 01:37:37 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 01:37:37 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574991454.52_dc10f49f-1fb5-425d-b431-1b4587bd283c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 01:37:37 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574991454.52', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:34069', 'job_port': u'0'}
19/11/29 01:37:37 INFO statecache.__init__: Creating state cache with size 0
19/11/29 01:37:37 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39535.
19/11/29 01:37:37 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/29 01:37:37 INFO sdk_worker.__init__: Control channel established.
19/11/29 01:37:37 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 01:37:37 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38385.
19/11/29 01:37:37 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 01:37:37 INFO data_plane.create_data_channel: Creating client data channel for localhost:37481
19/11/29 01:37:37 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 01:37:37 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/29 01:37:37 INFO sdk_worker.run: No more requests from control plane
19/11/29 01:37:37 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 01:37:37 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 01:37:37 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 01:37:37 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 01:37:37 INFO sdk_worker.run: Done consuming work.
19/11/29 01:37:37 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 01:37:37 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 01:37:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 01:37:38 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 01:37:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 01:37:39 INFO sdk_worker_main.main: Logging handler created.
19/11/29 01:37:39 INFO sdk_worker_main.start: Status HTTP server running at localhost:33617
19/11/29 01:37:39 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 01:37:39 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 01:37:39 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574991454.52_dc10f49f-1fb5-425d-b431-1b4587bd283c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 01:37:39 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574991454.52', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:34069', 'job_port': u'0'}
19/11/29 01:37:39 INFO statecache.__init__: Creating state cache with size 0
19/11/29 01:37:39 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38643.
19/11/29 01:37:39 INFO sdk_worker.__init__: Control channel established.
19/11/29 01:37:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/29 01:37:39 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 01:37:39 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42013.
19/11/29 01:37:39 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 01:37:39 INFO data_plane.create_data_channel: Creating client data channel for localhost:45069
19/11/29 01:37:39 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 01:37:39 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/29 01:37:39 INFO sdk_worker.run: No more requests from control plane
19/11/29 01:37:39 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 01:37:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 01:37:39 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 01:37:39 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 01:37:39 INFO sdk_worker.run: Done consuming work.
19/11/29 01:37:39 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 01:37:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 01:37:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 01:37:39 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 01:37:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 01:37:40 INFO sdk_worker_main.main: Logging handler created.
19/11/29 01:37:40 INFO sdk_worker_main.start: Status HTTP server running at localhost:41715
19/11/29 01:37:40 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 01:37:40 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 01:37:40 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574991454.52_dc10f49f-1fb5-425d-b431-1b4587bd283c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 01:37:40 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574991454.52', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:34069', 'job_port': u'0'}
19/11/29 01:37:40 INFO statecache.__init__: Creating state cache with size 0
19/11/29 01:37:40 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36951.
19/11/29 01:37:40 INFO sdk_worker.__init__: Control channel established.
19/11/29 01:37:40 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 01:37:40 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/29 01:37:40 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34227.
19/11/29 01:37:40 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 01:37:40 INFO data_plane.create_data_channel: Creating client data channel for localhost:42807
19/11/29 01:37:40 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 01:37:40 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/29 01:37:40 INFO sdk_worker.run: No more requests from control plane
19/11/29 01:37:40 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 01:37:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 01:37:40 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 01:37:40 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 01:37:40 INFO sdk_worker.run: Done consuming work.
19/11/29 01:37:40 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 01:37:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 01:37:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 01:37:40 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 01:37:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 01:37:41 INFO sdk_worker_main.main: Logging handler created.
19/11/29 01:37:41 INFO sdk_worker_main.start: Status HTTP server running at localhost:36879
19/11/29 01:37:41 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 01:37:41 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 01:37:41 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574991454.52_dc10f49f-1fb5-425d-b431-1b4587bd283c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 01:37:41 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574991454.52', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:34069', 'job_port': u'0'}
19/11/29 01:37:41 INFO statecache.__init__: Creating state cache with size 0
19/11/29 01:37:41 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37759.
19/11/29 01:37:41 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/29 01:37:41 INFO sdk_worker.__init__: Control channel established.
19/11/29 01:37:41 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 01:37:41 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33627.
19/11/29 01:37:41 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 01:37:41 INFO data_plane.create_data_channel: Creating client data channel for localhost:33507
19/11/29 01:37:41 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 01:37:41 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/29 01:37:41 INFO sdk_worker.run: No more requests from control plane
19/11/29 01:37:41 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 01:37:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 01:37:41 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 01:37:41 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 01:37:41 INFO sdk_worker.run: Done consuming work.
19/11/29 01:37:41 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 01:37:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 01:37:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 01:37:41 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574991454.52_dc10f49f-1fb5-425d-b431-1b4587bd283c finished.
19/11/29 01:37:41 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/29 01:37:41 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_152e6e81-7877-4988-8d79-1fbaae46340b","basePath":"/tmp/sparktestZOrA2j"}: {}
java.io.FileNotFoundException: /tmp/sparktestZOrA2j/job_152e6e81-7877-4988-8d79-1fbaae46340b/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
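[Editor's note] The cleanup ERROR above is cosmetic: every manifest request in this run logs "GetManifest for __no_artifacts_staged__", so no MANIFEST was ever written, yet removeArtifacts unconditionally tries to open one. A tolerant cleanup would probe first. A hedged Python sketch; the helper name is hypothetical, and the token layout is taken from the log line, not from Beam's Java implementation:

    import json
    import os
    import shutil

    def remove_staged_artifacts(staging_token):
        # staging_token as logged: {"sessionId": ..., "basePath": ...}
        token = json.loads(staging_token)
        job_dir = os.path.join(token['basePath'], token['sessionId'])
        manifest = os.path.join(job_dir, 'MANIFEST')
        if not os.path.exists(manifest):
            # Nothing was staged for this job; skip cleanup instead of
            # raising FileNotFoundException like the trace above.
            return
        shutil.rmtree(job_dir, ignore_errors=True)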
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139828471441152)>
# Thread: <Thread(Thread-119, started daemon 139828488226560)>
# Thread: <_MainThread(MainThread, started 139829338294016)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574991443.9_c80e0d24-c7cd-4947-9bdc-50653bf49598 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 286.874s

FAILED (errors=2, skipped=9)
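[Editor's note] The "Timed out after 60 seconds" banners and "# Thread:" dumps that interleave with the tracebacks above come from a per-test watchdog (the `handler` frame at portable_runner_test.py:75) firing on another thread while the traceback prints. A sketch of that style of watchdog, assuming a POSIX SIGALRM; the function name is illustrative, not the test suite's actual helper:

    import signal
    import sys
    import threading
    import traceback

    def install_timeout(seconds=60):
        # On timeout: print a banner, dump every live thread's stack, then
        # raise BaseException so no `except Exception` in the test body can
        # swallow it. This matches the banner and "# Thread:" lines above.
        def handler(signum, frame):
            msg = 'Timed out after %d seconds.' % seconds
            print('==================== %s ====================' % msg)
            frames = sys._current_frames()
            for t in threading.enumerate():
                print('# Thread: %s' % t)
                if t.ident in frames:
                    traceback.print_stack(frames[t.ident])
            raise BaseException(msg)
        signal.signal(signal.SIGALRM, handler)
        signal.alarm(seconds)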

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 21s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/pvpz7nvgxzics

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1650

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1650/display/redirect>

Changes:


------------------------------------------
[...truncated 1.31 MB...]
19/11/29 00:15:08 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34953.
19/11/29 00:15:08 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 00:15:08 INFO data_plane.create_data_channel: Creating client data channel for localhost:35509
19/11/29 00:15:08 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 00:15:08 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/29 00:15:08 INFO sdk_worker.run: No more requests from control plane
19/11/29 00:15:08 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 00:15:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 00:15:08 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 00:15:08 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 00:15:08 INFO sdk_worker.run: Done consuming work.
19/11/29 00:15:08 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 00:15:08 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 00:15:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 00:15:08 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 00:15:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 00:15:09 INFO sdk_worker_main.main: Logging handler created.
19/11/29 00:15:09 INFO sdk_worker_main.start: Status HTTP server running at localhost:42109
19/11/29 00:15:09 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 00:15:09 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 00:15:09 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574986506.55_ed89aacb-d6dd-465d-a448-52124f6bd9b4', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 00:15:09 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574986506.55', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48649', 'job_port': u'0'}
19/11/29 00:15:09 INFO statecache.__init__: Creating state cache with size 0
19/11/29 00:15:09 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41497.
19/11/29 00:15:09 INFO sdk_worker.__init__: Control channel established.
19/11/29 00:15:09 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 00:15:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/29 00:15:09 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44863.
19/11/29 00:15:09 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 00:15:09 INFO data_plane.create_data_channel: Creating client data channel for localhost:33501
19/11/29 00:15:09 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 00:15:09 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/29 00:15:09 INFO sdk_worker.run: No more requests from control plane
19/11/29 00:15:09 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 00:15:09 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 00:15:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 00:15:09 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 00:15:09 INFO sdk_worker.run: Done consuming work.
19/11/29 00:15:09 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 00:15:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 00:15:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 00:15:09 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 00:15:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 00:15:09 INFO sdk_worker_main.main: Logging handler created.
19/11/29 00:15:09 INFO sdk_worker_main.start: Status HTTP server running at localhost:43469
19/11/29 00:15:09 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 00:15:09 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 00:15:09 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574986506.55_ed89aacb-d6dd-465d-a448-52124f6bd9b4', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 00:15:09 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574986506.55', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48649', 'job_port': u'0'}
19/11/29 00:15:09 INFO statecache.__init__: Creating state cache with size 0
19/11/29 00:15:09 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46185.
19/11/29 00:15:09 INFO sdk_worker.__init__: Control channel established.
19/11/29 00:15:09 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 00:15:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/29 00:15:10 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42357.
19/11/29 00:15:10 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 00:15:10 INFO data_plane.create_data_channel: Creating client data channel for localhost:45441
19/11/29 00:15:10 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 00:15:10 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/29 00:15:10 INFO sdk_worker.run: No more requests from control plane
19/11/29 00:15:10 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 00:15:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 00:15:10 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 00:15:10 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 00:15:10 INFO sdk_worker.run: Done consuming work.
19/11/29 00:15:10 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 00:15:10 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 00:15:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 00:15:10 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 00:15:10 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 00:15:10 INFO sdk_worker_main.main: Logging handler created.
19/11/29 00:15:10 INFO sdk_worker_main.start: Status HTTP server running at localhost:41001
19/11/29 00:15:10 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 00:15:10 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 00:15:10 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574986506.55_ed89aacb-d6dd-465d-a448-52124f6bd9b4', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 00:15:10 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574986506.55', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48649', 'job_port': u'0'}
19/11/29 00:15:10 INFO statecache.__init__: Creating state cache with size 0
19/11/29 00:15:10 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36987.
19/11/29 00:15:10 INFO sdk_worker.__init__: Control channel established.
19/11/29 00:15:10 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/29 00:15:10 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 00:15:10 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36841.
19/11/29 00:15:10 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 00:15:10 INFO data_plane.create_data_channel: Creating client data channel for localhost:46021
19/11/29 00:15:10 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 00:15:10 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/29 00:15:10 INFO sdk_worker.run: No more requests from control plane
19/11/29 00:15:10 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 00:15:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 00:15:10 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 00:15:10 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 00:15:10 INFO sdk_worker.run: Done consuming work.
19/11/29 00:15:10 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 00:15:10 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 00:15:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 00:15:11 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 00:15:11 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 00:15:11 INFO sdk_worker_main.main: Logging handler created.
19/11/29 00:15:11 INFO sdk_worker_main.start: Status HTTP server running at localhost:45481
19/11/29 00:15:11 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 00:15:11 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 00:15:11 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574986506.55_ed89aacb-d6dd-465d-a448-52124f6bd9b4', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 00:15:11 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574986506.55', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48649', 'job_port': u'0'}
19/11/29 00:15:11 INFO statecache.__init__: Creating state cache with size 0
19/11/29 00:15:11 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37199.
19/11/29 00:15:11 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/29 00:15:11 INFO sdk_worker.__init__: Control channel established.
19/11/29 00:15:11 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 00:15:11 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35381.
19/11/29 00:15:11 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 00:15:11 INFO data_plane.create_data_channel: Creating client data channel for localhost:36523
19/11/29 00:15:11 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 00:15:11 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/29 00:15:11 INFO sdk_worker.run: No more requests from control plane
19/11/29 00:15:11 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 00:15:11 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 00:15:11 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 00:15:11 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 00:15:11 INFO sdk_worker.run: Done consuming work.
19/11/29 00:15:11 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 00:15:11 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 00:15:11 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 00:15:11 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574986506.55_ed89aacb-d6dd-465d-a448-52124f6bd9b4 finished.
19/11/29 00:15:11 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/29 00:15:11 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_4c87db9a-2362-43a4-8fc0-3bf93b5d00d7","basePath":"/tmp/sparktestfCu1th"}: {}
java.io.FileNotFoundException: /tmp/sparktestfCu1th/job_4c87db9a-2362-43a4-8fc0-3bf93b5d00d7/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574986497.17_4451bb5c-06f7-4f54-9832-4c2cd5f4e1e5 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140161978164992)>
# Thread: <Thread(Thread-120, started daemon 140161986557696)>
# Thread: <_MainThread(MainThread, started 140162776860416)>
----------------------------------------------------------------------
Ran 38 tests in 299.835s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 35s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...

Publishing failed.

The response from https://scans-in.gradle.com/in/5.2.1/2.3 was not from the build scan server.
Your network environment may be interfering, or the service may be unavailable.

If you believe this to be in error, please report this problem via https://gradle.com/scans/help/plugin and include the following via copy/paste:

----------
Gradle version: 5.2.1
Plugin version: 2.3
Request URL: https://scans-in.gradle.com/in/5.2.1/2.3
Request ID: 7392d0df-69d3-4bd2-9090-2d798b99440a
Response status code: 502
Response content type: text/html; charset=UTF-8
Response server type: cloudflare
----------

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1649

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1649/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/11/28 18:14:17 INFO sdk_worker_main.start: Status HTTP server running at localhost:37201
19/11/28 18:14:17 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 18:14:17 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 18:14:17 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574964854.35_3a9e48a6-8429-458a-9fde-a5e77bf61044', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 18:14:17 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574964854.35', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40867', 'job_port': u'0'}
19/11/28 18:14:17 INFO statecache.__init__: Creating state cache with size 0
19/11/28 18:14:17 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46535.
19/11/28 18:14:17 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/28 18:14:17 INFO sdk_worker.__init__: Control channel established.
19/11/28 18:14:17 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 18:14:17 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36715.
19/11/28 18:14:17 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 18:14:17 INFO data_plane.create_data_channel: Creating client data channel for localhost:45499
19/11/28 18:14:17 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 18:14:17 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/28 18:14:17 INFO sdk_worker.run: No more requests from control plane
19/11/28 18:14:17 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 18:14:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 18:14:17 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 18:14:17 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 18:14:17 INFO sdk_worker.run: Done consuming work.
19/11/28 18:14:17 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 18:14:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 18:14:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 18:14:17 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/28 18:14:18 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/28 18:14:18 INFO sdk_worker_main.main: Logging handler created.
19/11/28 18:14:18 INFO sdk_worker_main.start: Status HTTP server running at localhost:34085
19/11/28 18:14:18 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 18:14:18 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 18:14:18 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574964854.35_3a9e48a6-8429-458a-9fde-a5e77bf61044', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 18:14:18 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574964854.35', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40867', 'job_port': u'0'}
19/11/28 18:14:18 INFO statecache.__init__: Creating state cache with size 0
19/11/28 18:14:18 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46433.
19/11/28 18:14:18 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/28 18:14:18 INFO sdk_worker.__init__: Control channel established.
19/11/28 18:14:18 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 18:14:18 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33413.
19/11/28 18:14:18 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 18:14:18 INFO data_plane.create_data_channel: Creating client data channel for localhost:38639
19/11/28 18:14:18 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 18:14:18 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/28 18:14:18 INFO sdk_worker.run: No more requests from control plane
19/11/28 18:14:18 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 18:14:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 18:14:18 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 18:14:18 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 18:14:18 INFO sdk_worker.run: Done consuming work.
19/11/28 18:14:18 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 18:14:18 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 18:14:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 18:14:18 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/28 18:14:19 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/28 18:14:19 INFO sdk_worker_main.main: Logging handler created.
19/11/28 18:14:19 INFO sdk_worker_main.start: Status HTTP server running at localhost:33269
19/11/28 18:14:19 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 18:14:19 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 18:14:19 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574964854.35_3a9e48a6-8429-458a-9fde-a5e77bf61044', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 18:14:19 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574964854.35', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40867', 'job_port': u'0'}
19/11/28 18:14:19 INFO statecache.__init__: Creating state cache with size 0
19/11/28 18:14:19 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46051.
19/11/28 18:14:19 INFO sdk_worker.__init__: Control channel established.
19/11/28 18:14:19 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 18:14:19 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/28 18:14:19 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38331.
19/11/28 18:14:19 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 18:14:19 INFO data_plane.create_data_channel: Creating client data channel for localhost:38693
19/11/28 18:14:19 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 18:14:19 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/28 18:14:19 INFO sdk_worker.run: No more requests from control plane
19/11/28 18:14:19 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 18:14:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 18:14:19 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 18:14:19 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 18:14:19 INFO sdk_worker.run: Done consuming work.
19/11/28 18:14:19 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 18:14:19 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 18:14:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 18:14:19 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/28 18:14:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/28 18:14:20 INFO sdk_worker_main.main: Logging handler created.
19/11/28 18:14:20 INFO sdk_worker_main.start: Status HTTP server running at localhost:39253
19/11/28 18:14:20 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 18:14:20 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 18:14:20 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574964854.35_3a9e48a6-8429-458a-9fde-a5e77bf61044', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 18:14:20 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574964854.35', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40867', 'job_port': u'0'}
19/11/28 18:14:20 INFO statecache.__init__: Creating state cache with size 0
19/11/28 18:14:20 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36793.
19/11/28 18:14:20 INFO sdk_worker.__init__: Control channel established.
19/11/28 18:14:20 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 18:14:20 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/28 18:14:20 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38779.
19/11/28 18:14:20 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 18:14:20 INFO data_plane.create_data_channel: Creating client data channel for localhost:36697
19/11/28 18:14:20 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 18:14:20 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/28 18:14:20 INFO sdk_worker.run: No more requests from control plane
19/11/28 18:14:20 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 18:14:20 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 18:14:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 18:14:20 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 18:14:20 INFO sdk_worker.run: Done consuming work.
19/11/28 18:14:20 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 18:14:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 18:14:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 18:14:20 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574964854.35_3a9e48a6-8429-458a-9fde-a5e77bf61044 finished.
19/11/28 18:14:20 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/28 18:14:20 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_fa3ed5b1-7c2a-42a7-933d-a31963c1d0c5","basePath":"/tmp/sparktestBx1X1F"}: {}
java.io.FileNotFoundException: /tmp/sparktestBx1X1F/job_fa3ed5b1-7c2a-42a7-933d-a31963c1d0c5/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
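The innermost frame above, portable_runner_test.py line 75 in handler, is a test watchdog rather than product code: after 60 seconds it dumps every live thread (the "# Thread: <...>" lines seen throughout this log) and raises BaseException to break out of the blocked gRPC read. Because it writes to the same stream as the test reporter, its output lands in the middle of whatever the reporter is printing, which is why thread dumps appear alongside the failure tracebacks below. A minimal sketch of that watchdog pattern, assuming a Unix SIGALRM; the actual helper in portable_runner_test.py may differ in detail:

    import signal
    import threading

    TIMEOUT_SECS = 60

    def handler(signum, frame):
        # Dump live threads, then abort the blocked wait_until_finish() call.
        msg = 'Timed out after %d seconds.' % TIMEOUT_SECS
        print('==================== %s ====================' % msg)
        for t in threading.enumerate():
            print('# Thread: %s' % t)
        raise BaseException(msg)

    signal.signal(signal.SIGALRM, handler)
    signal.alarm(TIMEOUT_SECS)  # disarm with signal.alarm(0) once the test finishes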

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139901823129344)>

# Thread: <Thread(Thread-119, started daemon 139901814736640)>

# Thread: <_MainThread(MainThread, started 139902611261184)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139901797426944)>

# Thread: <Thread(Thread-124, started daemon 139901806081792)>

# Thread: <Thread(Thread-119, started daemon 139901814736640)>

# Thread: <_MainThread(MainThread, started 139902611261184)>

# Thread: <Thread(wait_until_finish_read, started daemon 139901823129344)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574964843.24_bbd1a1e1-5f35-40f2-b3e9-7c55bfefa023 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
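test_sdf_with_watermark_tracking drives a splittable DoFn that periodically defers the unprocessed remainder of its restriction so the watermark can advance; completing that deferred work requires the runner to accept a checkpoint of the residual and reschedule it. The portable Spark runner registers no such handler on its ActiveBundle, hence the UnsupportedOperationException. A hypothetical sketch of the missing piece, in Python for illustration only (the real interface is Java-side, on ActiveBundle):

    class BundleCheckpointHandler(object):
        # Hypothetical interface: called when an SDF checkpoints mid-bundle.
        def on_checkpoint(self, residual):
            raise NotImplementedError

    class RequeueResiduals(BundleCheckpointHandler):
        # One plausible implementation: push the unprocessed remainder back
        # onto a work queue so a later bundle re-executes it, letting the
        # watermark advance past the portion already committed.
        def __init__(self, work_queue):
            self._work_queue = work_queue

        def on_checkpoint(self, residual):
            self._work_queue.append(residual)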

----------------------------------------------------------------------
Ran 38 tests in 353.300s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
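The Gradle task runs the Python suite through a shell script, so Gradle sees only the script's exit status: any unittest failure above surfaces here as "non-zero exit value 1". The same propagation, sketched in Python with a hypothetical script name:

    import subprocess

    # check_call raises CalledProcessError when the command exits non-zero,
    # which is the Python analogue of Gradle failing this script-driven task.
    subprocess.check_call(['sh', 'run_validates_runner_tests.sh'])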

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 46s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/5ldhideppofye

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1648

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1648/display/redirect?page=changes>

Changes:

[mxm] [BEAM-8656] Update documentation for flink_master parameter


------------------------------------------
[...truncated 1.32 MB...]
19/11/28 14:35:04 INFO data_plane.create_data_channel: Creating client data channel for localhost:40935
19/11/28 14:35:04 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 14:35:05 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/28 14:35:05 INFO sdk_worker.run: No more requests from control plane
19/11/28 14:35:05 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 14:35:05 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 14:35:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 14:35:05 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 14:35:05 INFO sdk_worker.run: Done consuming work.
19/11/28 14:35:05 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 14:35:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 14:35:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 14:35:05 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/28 14:35:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/28 14:35:05 INFO sdk_worker_main.main: Logging handler created.
19/11/28 14:35:05 INFO sdk_worker_main.start: Status HTTP server running at localhost:32881
19/11/28 14:35:05 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 14:35:05 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 14:35:05 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574951702.43_4f27f123-ce51-4955-af39-7ffa2192a2a0', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 14:35:05 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574951702.43', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50283', 'job_port': u'0'}
19/11/28 14:35:05 INFO statecache.__init__: Creating state cache with size 0
19/11/28 14:35:05 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33943.
19/11/28 14:35:05 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/28 14:35:05 INFO sdk_worker.__init__: Control channel established.
19/11/28 14:35:05 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 14:35:05 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34067.
19/11/28 14:35:05 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 14:35:05 INFO data_plane.create_data_channel: Creating client data channel for localhost:40431
19/11/28 14:35:05 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
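Each bundle above boots a fresh SDK harness that dials back to the runner over three plaintext gRPC channels (control, state, data) on dynamically assigned loopback ports. A minimal sketch of the client side of that handshake; the port below is just the one this log happened to print for a control channel:

    import grpc

    # Plaintext ("insecure") channel, matching the "Creating insecure control
    # channel" log lines; real ports vary per bundle.
    channel = grpc.insecure_channel('localhost:33943')
    grpc.channel_ready_future(channel).result(timeout=10)  # wait until connected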
19/11/28 14:35:06 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/28 14:35:06 INFO sdk_worker.run: No more requests from control plane
19/11/28 14:35:06 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 14:35:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 14:35:06 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 14:35:06 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 14:35:06 INFO sdk_worker.run: Done consuming work.
19/11/28 14:35:06 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 14:35:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 14:35:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 14:35:06 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/28 14:35:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/28 14:35:06 INFO sdk_worker_main.main: Logging handler created.
19/11/28 14:35:06 INFO sdk_worker_main.start: Status HTTP server running at localhost:40447
19/11/28 14:35:06 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 14:35:06 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 14:35:06 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574951702.43_4f27f123-ce51-4955-af39-7ffa2192a2a0', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 14:35:06 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574951702.43', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50283', 'job_port': u'0'}
19/11/28 14:35:06 INFO statecache.__init__: Creating state cache with size 0
19/11/28 14:35:06 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34197.
19/11/28 14:35:06 INFO sdk_worker.__init__: Control channel established.
19/11/28 14:35:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/28 14:35:06 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 14:35:06 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:32805.
19/11/28 14:35:06 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 14:35:06 INFO data_plane.create_data_channel: Creating client data channel for localhost:38987
19/11/28 14:35:06 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 14:35:06 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/28 14:35:06 INFO sdk_worker.run: No more requests from control plane
19/11/28 14:35:06 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 14:35:06 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 14:35:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 14:35:06 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 14:35:06 INFO sdk_worker.run: Done consuming work.
19/11/28 14:35:06 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 14:35:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 14:35:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 14:35:07 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/28 14:35:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/28 14:35:07 INFO sdk_worker_main.main: Logging handler created.
19/11/28 14:35:07 INFO sdk_worker_main.start: Status HTTP server running at localhost:35563
19/11/28 14:35:07 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 14:35:07 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 14:35:07 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574951702.43_4f27f123-ce51-4955-af39-7ffa2192a2a0', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 14:35:07 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574951702.43', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50283', 'job_port': u'0'}
19/11/28 14:35:07 INFO statecache.__init__: Creating state cache with size 0
19/11/28 14:35:07 INFO sdk_worker.__init__: Creating insecure control channel for localhost:32911.
19/11/28 14:35:07 INFO sdk_worker.__init__: Control channel established.
19/11/28 14:35:07 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/28 14:35:07 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 14:35:07 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45251.
19/11/28 14:35:07 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 14:35:07 INFO data_plane.create_data_channel: Creating client data channel for localhost:38299
19/11/28 14:35:07 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 14:35:07 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/28 14:35:07 INFO sdk_worker.run: No more requests from control plane
19/11/28 14:35:07 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 14:35:07 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 14:35:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 14:35:07 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 14:35:07 INFO sdk_worker.run: Done consuming work.
19/11/28 14:35:07 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 14:35:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 14:35:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 14:35:07 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574951702.43_4f27f123-ce51-4955-af39-7ffa2192a2a0 finished.
19/11/28 14:35:07 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/28 14:35:07 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_0e5c2d4e-9b09-4a38-8456-5171e1826f76","basePath":"/tmp/sparktestsViQPm"}: {}
java.io.FileNotFoundException: /tmp/sparktestsViQPm/job_0e5c2d4e-9b09-4a38-8456-5171e1826f76/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139822749406976)>

# Thread: <Thread(Thread-119, started daemon 139822766192384)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

# Thread: <_MainThread(MainThread, started 139823545640704)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139822123448064)>

# Thread: <Thread(Thread-124, started daemon 139822740227840)>

# Thread: <_MainThread(MainThread, started 139823545640704)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574951693.11_b93923b1-4ff8-4d14-a784-f26d4d84b26b failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 298.706s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 28s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...

Publishing failed.

The response from https://scans-in.gradle.com/in/5.2.1/2.3 was not from the build scan server.
Your network environment may be interfering, or the service may be unavailable.

If you believe this to be in error, please report this problem via https://gradle.com/scans/help/plugin and include the following via copy/paste:

----------
Gradle version: 5.2.1
Plugin version: 2.3
Request URL: https://scans-in.gradle.com/in/5.2.1/2.3
Request ID: ae6c3e66-1515-47d0-910c-e587ce3c8145
Response status code: 503
Response content type: text/html; charset=utf-8
Response server type: cloudflare
----------

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1647

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1647/display/redirect?page=changes>

Changes:

[piotr.szczepanik] [BEAM-8819] Fix AvroCoder serialisation by introduction of

[piotr.szczepanik] Added missing license header for AvroGenericCoder

[piotr.szczepanik] Fixed code style violations

[piotr.szczepanik] Fixed missing AvroCoder -> AvroGenericCoder in python tests

[coheigea] A fix for some TLS issues in the MongoDB IO


------------------------------------------
[...truncated 1.42 MB...]
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

# Thread: <Thread(wait_until_finish_read, started daemon 140401583310592)>

# Thread: <Thread(Thread-135, started daemon 140401591703296)>

# Thread: <Thread(wait_until_finish_read, started daemon 140401600096000)>

# Thread: <_MainThread(MainThread, started 140403418679040)>

# Thread: <Thread(Thread-125, started daemon 140402103396096)>

# Thread: <Thread(wait_until_finish_read, started daemon 140402095003392)>

# Thread: <Thread(Thread-130, started daemon 140401608488704)>

======================================================================
ERROR: test_pardo_unfusable_side_inputs (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 244, in test_pardo_unfusable_side_inputs
    equal_to([('a', 'a'), ('a', 'b'), ('b', 'a'), ('b', 'b')]))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_windowed_side_inputs (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 181, in test_pardo_windowed_side_inputs
    label='windowed')
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_read (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 578, in test_read
    equal_to(['a', 'b', 'c']))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_reshuffle (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 548, in test_reshuffle
    equal_to([1, 2, 3]))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_check_done_failed (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 470, in test_sdf_with_check_done_failed
    | beam.ParDo(ExpandingStringsDoFn()))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574949668.15_6f5b9749-c444-42b9-969c-1df39118c542 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 755.924s

FAILED (errors=8, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 16m 17s
60 actionable tasks: 48 executed, 12 from cache

Publishing build scan...

Publishing failed.

The response from https://scans-in.gradle.com/in/5.2.1/2.3 was not from the build scan server.
Your network environment may be interfering, or the service may be unavailable.

If you believe this to be in error, please report this problem via https://gradle.com/scans/help/plugin and include the following via copy/paste:

----------
Gradle version: 5.2.1
Plugin version: 2.3
Request URL: https://scans-in.gradle.com/in/5.2.1/2.3
Request ID: 8530ed8e-ca5d-465b-8c69-b28eb443a8ac
Response status code: 503
Response content type: text/html; charset=utf-8
Response server type: cloudflare
----------

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1646

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1646/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]

19/11/28 12:14:00 INFO sdk_worker.run: No more requests from control plane
19/11/28 12:14:00 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 12:14:00 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 12:14:00 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 12:14:00 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 12:14:00 INFO sdk_worker.run: Done consuming work.
19/11/28 12:14:00 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 12:14:00 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 12:14:00 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 12:14:00 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/28 12:14:01 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/28 12:14:01 INFO sdk_worker_main.main: Logging handler created.
19/11/28 12:14:01 INFO sdk_worker_main.start: Status HTTP server running at localhost:38815
19/11/28 12:14:01 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 12:14:01 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 12:14:01 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574943238.09_4de6c956-10c8-4539-852f-15cf4f277972', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 12:14:01 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574943238.09', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:38275', 'job_port': u'0'}
19/11/28 12:14:01 INFO statecache.__init__: Creating state cache with size 0
19/11/28 12:14:01 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45223.
19/11/28 12:14:01 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/28 12:14:01 INFO sdk_worker.__init__: Control channel established.
19/11/28 12:14:01 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 12:14:01 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44935.
19/11/28 12:14:01 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 12:14:01 INFO data_plane.create_data_channel: Creating client data channel for localhost:37205
19/11/28 12:14:01 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 12:14:01 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/28 12:14:01 INFO sdk_worker.run: No more requests from control plane
19/11/28 12:14:01 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 12:14:01 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 12:14:01 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 12:14:01 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 12:14:01 INFO sdk_worker.run: Done consuming work.
19/11/28 12:14:01 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 12:14:01 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 12:14:01 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 12:14:01 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/28 12:14:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/28 12:14:02 INFO sdk_worker_main.main: Logging handler created.
19/11/28 12:14:02 INFO sdk_worker_main.start: Status HTTP server running at localhost:41311
19/11/28 12:14:02 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 12:14:02 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 12:14:02 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574943238.09_4de6c956-10c8-4539-852f-15cf4f277972', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 12:14:02 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574943238.09', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:38275', 'job_port': u'0'}
19/11/28 12:14:02 INFO statecache.__init__: Creating state cache with size 0
19/11/28 12:14:02 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39427.
19/11/28 12:14:02 INFO sdk_worker.__init__: Control channel established.
19/11/28 12:14:02 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 12:14:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/28 12:14:02 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43007.
19/11/28 12:14:02 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 12:14:02 INFO data_plane.create_data_channel: Creating client data channel for localhost:36617
19/11/28 12:14:02 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 12:14:02 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/28 12:14:02 INFO sdk_worker.run: No more requests from control plane
19/11/28 12:14:02 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 12:14:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 12:14:02 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 12:14:02 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 12:14:02 INFO sdk_worker.run: Done consuming work.
19/11/28 12:14:02 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 12:14:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 12:14:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 12:14:02 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/28 12:14:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/28 12:14:03 INFO sdk_worker_main.main: Logging handler created.
19/11/28 12:14:03 INFO sdk_worker_main.start: Status HTTP server running at localhost:42407
19/11/28 12:14:03 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 12:14:03 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 12:14:03 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574943238.09_4de6c956-10c8-4539-852f-15cf4f277972', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 12:14:03 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574943238.09', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:38275', 'job_port': u'0'}
19/11/28 12:14:03 INFO statecache.__init__: Creating state cache with size 0
19/11/28 12:14:03 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40333.
19/11/28 12:14:03 INFO sdk_worker.__init__: Control channel established.
19/11/28 12:14:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/28 12:14:03 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 12:14:03 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40295.
19/11/28 12:14:03 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 12:14:03 INFO data_plane.create_data_channel: Creating client data channel for localhost:36579
19/11/28 12:14:03 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 12:14:03 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/28 12:14:03 INFO sdk_worker.run: No more requests from control plane
19/11/28 12:14:03 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 12:14:03 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 12:14:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 12:14:03 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 12:14:03 INFO sdk_worker.run: Done consuming work.
19/11/28 12:14:03 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 12:14:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 12:14:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 12:14:03 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574943238.09_4de6c956-10c8-4539-852f-15cf4f277972 finished.
19/11/28 12:14:03 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/28 12:14:03 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_597fd3fa-4b40-42fc-9364-a8d9e1cf63da","basePath":"/tmp/sparktestK4RKHK"}: {}
java.io.FileNotFoundException: /tmp/sparktestK4RKHK/job_597fd3fa-4b40-42fc-9364-a8d9e1cf63da/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
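
Note that the ERROR above is a cleanup-time artifact rather than a test failure: the MANIFEST for this session was evidently never written (other runs in this job log "GetManifest for __no_artifacts_staged__"), so removeArtifacts has nothing to read when it tears down the staging directory. A minimal sketch of the tolerant-cleanup pattern that would silence this, in Python purely for illustration (the code in the stack trace is Beam's Java; remove_staging_dir and staging_dir are hypothetical names):

    import errno
    import os
    import shutil

    def remove_staging_dir(staging_dir):
        # If no MANIFEST was ever written, there is nothing staged to
        # enumerate -- treat the missing file as "nothing to clean up"
        # instead of logging an ERROR.
        manifest_path = os.path.join(staging_dir, "MANIFEST")
        try:
            with open(manifest_path) as f:
                f.read()  # normally parsed to locate staged artifacts
        except IOError as e:
            if e.errno != errno.ENOENT:
                raise
        shutil.rmtree(staging_dir, ignore_errors=True)
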
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
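
The shape of this traceback -- ordinary grpc/threading frames followed by a "handler" frame that raises -- is characteristic of an alarm-style watchdog: a signal handler fires on the main thread after 60 seconds and raises from whatever frame is currently executing, here the grpc wait loop. A minimal sketch of that pattern (an assumption inferred from the frames above, not the verbatim test code; TIMEOUT_SECS and install_watchdog are illustrative names):

    import signal

    TIMEOUT_SECS = 60  # matches the 60-second budget in the log

    def install_watchdog():
        def handler(signum, frame):
            # Raise BaseException rather than Exception so broad
            # "except Exception" clauses in the code under test
            # cannot swallow the timeout.
            raise BaseException('Timed out after %d seconds.' % TIMEOUT_SECS)
        signal.signal(signal.SIGALRM, handler)
        signal.alarm(TIMEOUT_SECS)

Because the signal handler runs on the main thread, the raise surfaces inside threading.py's wait(), which is exactly where the traceback above ends before the handler frame.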

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139931569612544)>
# Thread: <Thread(Thread-120, started daemon 139931578005248)>
# Thread: <_MainThread(MainThread, started 139932706768640)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139931561219840)>
# Thread: <Thread(Thread-126, started daemon 139931552827136)>
# Thread: <Thread(Thread-120, started daemon 139931578005248)>
# Thread: <_MainThread(MainThread, started 139932706768640)>
# Thread: <Thread(wait_until_finish_read, started daemon 139931569612544)>
======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574943229.32_d4095ec5-e217-4456-a9a8-56bd8ebd5264 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
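
The UnsupportedOperationException here is a runner-capability gap rather than a flaky test: test_sdf_with_watermark_tracking exercises a splittable DoFn, and when the SDK harness hands back an unprocessed residual (a checkpoint), the Spark portable runner has nowhere to put it because no bundle checkpoint handler is registered. A toy model of the split that produces such a residual (illustrative only, not the Beam Fn API; all names below are hypothetical):

    class ToyRestrictionTracker(object):
        """Tracks progress through [start, stop); a toy, not Beam's API."""

        def __init__(self, start, stop):
            self.pos = start
            self.stop = stop

        def try_claim(self, pos):
            if pos >= self.stop:
                return False
            self.pos = pos
            return True

        def checkpoint(self):
            # Split off everything not yet claimed. The runner must
            # accept and reschedule this residual -- that is the job of
            # the missing "bundle checkpoint handler" in the error above.
            residual = (self.pos + 1, self.stop)
            self.stop = self.pos + 1
            return residual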

----------------------------------------------------------------------
Ran 38 tests in 310.523s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 42s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...

Publishing failed.

The response from https://scans-in.gradle.com/in/5.2.1/2.3 was not from the build scan server.
Your network environment may be interfering, or the service may be unavailable.

If you believe this to be in error, please report this problem via https://gradle.com/scans/help/plugin and include the following via copy/paste:

----------
Gradle version: 5.2.1
Plugin version: 2.3
Request URL: https://scans-in.gradle.com/in/5.2.1/2.3
Request ID: 162fa53d-c55c-4802-a633-0c993d02e40a
Response status code: 503
Response content type: text/html; charset=utf-8
Response server type: cloudflare
----------
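
The scan upload failed with a 503 served by cloudflare, i.e. a transient outage of the scans-in.gradle.com ingest endpoint, unrelated to the test failures above. For transient 5xx conditions like this, a retry with exponential backoff is the usual remedy; a minimal Python 2 sketch of that pattern (the URL and payload here stand in for whatever the build-scan plugin actually sends; this is not the plugin's code):

    import time
    import urllib2

    def post_with_retry(url, data, attempts=4, base_delay=2.0):
        # Retry only server-side (5xx) errors; anything else is a real
        # client error and should surface immediately.
        for attempt in range(attempts):
            try:
                return urllib2.urlopen(urllib2.Request(url, data))
            except urllib2.HTTPError as e:
                if e.code < 500 or attempt == attempts - 1:
                    raise
                time.sleep(base_delay * (2 ** attempt))
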

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1645

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1645/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]

19/11/28 06:14:54 INFO sdk_worker.run: No more requests from control plane
19/11/28 06:14:54 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 06:14:54 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 06:14:54 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 06:14:54 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 06:14:54 INFO sdk_worker.run: Done consuming work.
19/11/28 06:14:54 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 06:14:54 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 06:14:54 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 06:14:54 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/28 06:14:55 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/28 06:14:55 INFO sdk_worker_main.main: Logging handler created.
19/11/28 06:14:55 INFO sdk_worker_main.start: Status HTTP server running at localhost:44249
19/11/28 06:14:55 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 06:14:55 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 06:14:55 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574921691.59_f4a4491f-78ad-4d01-afee-d796008ad3fb', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 06:14:55 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574921691.59', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48831', 'job_port': u'0'}
19/11/28 06:14:55 INFO statecache.__init__: Creating state cache with size 0
19/11/28 06:14:55 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44797.
19/11/28 06:14:55 INFO sdk_worker.__init__: Control channel established.
19/11/28 06:14:55 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/28 06:14:55 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 06:14:55 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35441.
19/11/28 06:14:55 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 06:14:55 INFO data_plane.create_data_channel: Creating client data channel for localhost:42235
19/11/28 06:14:55 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 06:14:55 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/28 06:14:55 INFO sdk_worker.run: No more requests from control plane
19/11/28 06:14:55 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 06:14:55 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 06:14:55 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 06:14:55 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 06:14:55 INFO sdk_worker.run: Done consuming work.
19/11/28 06:14:55 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 06:14:55 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 06:14:55 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 06:14:55 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/28 06:14:56 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/28 06:14:56 INFO sdk_worker_main.main: Logging handler created.
19/11/28 06:14:56 INFO sdk_worker_main.start: Status HTTP server running at localhost:45305
19/11/28 06:14:56 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 06:14:56 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 06:14:56 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574921691.59_f4a4491f-78ad-4d01-afee-d796008ad3fb', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 06:14:56 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574921691.59', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48831', 'job_port': u'0'}
19/11/28 06:14:56 INFO statecache.__init__: Creating state cache with size 0
19/11/28 06:14:56 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44129.
19/11/28 06:14:56 INFO sdk_worker.__init__: Control channel established.
19/11/28 06:14:56 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/28 06:14:56 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 06:14:56 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40233.
19/11/28 06:14:56 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 06:14:56 INFO data_plane.create_data_channel: Creating client data channel for localhost:42161
19/11/28 06:14:56 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 06:14:56 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/28 06:14:56 INFO sdk_worker.run: No more requests from control plane
19/11/28 06:14:56 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 06:14:56 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 06:14:56 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 06:14:56 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 06:14:56 INFO sdk_worker.run: Done consuming work.
19/11/28 06:14:56 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 06:14:56 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 06:14:56 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 06:14:56 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/28 06:14:57 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/28 06:14:57 INFO sdk_worker_main.main: Logging handler created.
19/11/28 06:14:57 INFO sdk_worker_main.start: Status HTTP server running at localhost:39735
19/11/28 06:14:57 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 06:14:57 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 06:14:57 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574921691.59_f4a4491f-78ad-4d01-afee-d796008ad3fb', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 06:14:57 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574921691.59', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48831', 'job_port': u'0'}
19/11/28 06:14:57 INFO statecache.__init__: Creating state cache with size 0
19/11/28 06:14:57 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33991.
19/11/28 06:14:57 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/28 06:14:57 INFO sdk_worker.__init__: Control channel established.
19/11/28 06:14:57 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 06:14:57 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46677.
19/11/28 06:14:57 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 06:14:57 INFO data_plane.create_data_channel: Creating client data channel for localhost:42811
19/11/28 06:14:57 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 06:14:57 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/28 06:14:57 INFO sdk_worker.run: No more requests from control plane
19/11/28 06:14:57 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 06:14:57 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 06:14:57 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 06:14:57 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 06:14:57 INFO sdk_worker.run: Done consuming work.
19/11/28 06:14:57 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 06:14:57 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 06:14:57 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 06:14:57 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574921691.59_f4a4491f-78ad-4d01-afee-d796008ad3fb finished.
19/11/28 06:14:57 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/28 06:14:57 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_f3c16b7c-6678-4432-bb12-824a5508fcbc","basePath":"/tmp/sparktesttErq5x"}: {}
java.io.FileNotFoundException: /tmp/sparktesttErq5x/job_f3c16b7c-6678-4432-bb12-824a5508fcbc/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139773872174848)>
# Thread: <Thread(Thread-118, started daemon 139773863782144)>
# Thread: <_MainThread(MainThread, started 139774994171648)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139773838604032)>
# Thread: <Thread(Thread-124, started daemon 139773846996736)>
# Thread: <Thread(Thread-118, started daemon 139773863782144)>
# Thread: <_MainThread(MainThread, started 139774994171648)>
# Thread: <Thread(wait_until_finish_read, started daemon 139773872174848)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574921679.37_8c7502a3-82b7-41d1-9f9f-32555c68f686 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 364.459s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 10m 15s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...

Publishing failed.

The response from https://scans-in.gradle.com/in/5.2.1/2.3 was not from the build scan server.
Your network environment may be interfering, or the service may be unavailable.

If you believe this to be in error, please report this problem via https://gradle.com/scans/help/plugin and include the following via copy/paste:

----------
Gradle version: 5.2.1
Plugin version: 2.3
Request URL: https://scans-in.gradle.com/in/5.2.1/2.3
Request ID: 41b4815c-91bd-4209-8c52-ac5b176e3907
Response status code: 503
Response content type: text/html; charset=utf-8
Response server type: cloudflare
----------

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1644

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1644/display/redirect?page=changes>

Changes:

[ehudm] [BEAM-8842] Temporarily disable test


------------------------------------------
[...truncated 1.31 MB...]
19/11/28 01:21:01 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43607.
19/11/28 01:21:01 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 01:21:01 INFO data_plane.create_data_channel: Creating client data channel for localhost:45483
19/11/28 01:21:01 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 01:21:01 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/28 01:21:01 INFO sdk_worker.run: No more requests from control plane
19/11/28 01:21:01 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 01:21:01 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 01:21:01 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 01:21:01 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 01:21:01 INFO sdk_worker.run: Done consuming work.
19/11/28 01:21:01 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 01:21:01 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 01:21:01 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 01:21:01 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/28 01:21:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/28 01:21:02 INFO sdk_worker_main.main: Logging handler created.
19/11/28 01:21:02 INFO sdk_worker_main.start: Status HTTP server running at localhost:37387
19/11/28 01:21:02 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 01:21:02 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 01:21:02 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574904059.7_5141349a-a45a-45e4-bdc3-292aa594092f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 01:21:02 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574904059.7', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56129', 'job_port': u'0'}
19/11/28 01:21:02 INFO statecache.__init__: Creating state cache with size 0
19/11/28 01:21:02 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38797.
19/11/28 01:21:02 INFO sdk_worker.__init__: Control channel established.
19/11/28 01:21:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/28 01:21:02 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 01:21:02 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42703.
19/11/28 01:21:02 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 01:21:02 INFO data_plane.create_data_channel: Creating client data channel for localhost:38457
19/11/28 01:21:02 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 01:21:02 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/28 01:21:02 INFO sdk_worker.run: No more requests from control plane
19/11/28 01:21:02 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 01:21:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 01:21:02 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 01:21:02 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 01:21:02 INFO sdk_worker.run: Done consuming work.
19/11/28 01:21:02 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 01:21:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 01:21:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 01:21:02 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/28 01:21:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/28 01:21:03 INFO sdk_worker_main.main: Logging handler created.
19/11/28 01:21:03 INFO sdk_worker_main.start: Status HTTP server running at localhost:40983
19/11/28 01:21:03 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 01:21:03 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 01:21:03 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574904059.7_5141349a-a45a-45e4-bdc3-292aa594092f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 01:21:03 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574904059.7', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56129', 'job_port': u'0'}
19/11/28 01:21:03 INFO statecache.__init__: Creating state cache with size 0
19/11/28 01:21:03 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43823.
19/11/28 01:21:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/28 01:21:03 INFO sdk_worker.__init__: Control channel established.
19/11/28 01:21:03 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 01:21:03 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39975.
19/11/28 01:21:03 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 01:21:03 INFO data_plane.create_data_channel: Creating client data channel for localhost:37249
19/11/28 01:21:03 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 01:21:03 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/28 01:21:03 INFO sdk_worker.run: No more requests from control plane
19/11/28 01:21:03 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 01:21:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 01:21:03 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 01:21:03 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 01:21:03 INFO sdk_worker.run: Done consuming work.
19/11/28 01:21:03 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 01:21:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 01:21:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 01:21:03 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/28 01:21:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/28 01:21:03 INFO sdk_worker_main.main: Logging handler created.
19/11/28 01:21:03 INFO sdk_worker_main.start: Status HTTP server running at localhost:35237
19/11/28 01:21:03 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 01:21:03 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 01:21:03 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574904059.7_5141349a-a45a-45e4-bdc3-292aa594092f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 01:21:03 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574904059.7', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56129', 'job_port': u'0'}
19/11/28 01:21:03 INFO statecache.__init__: Creating state cache with size 0
19/11/28 01:21:03 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33061.
19/11/28 01:21:03 INFO sdk_worker.__init__: Control channel established.
19/11/28 01:21:03 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 01:21:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/28 01:21:03 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33051.
19/11/28 01:21:03 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 01:21:03 INFO data_plane.create_data_channel: Creating client data channel for localhost:43049
19/11/28 01:21:03 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 01:21:04 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/28 01:21:04 INFO sdk_worker.run: No more requests from control plane
19/11/28 01:21:04 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 01:21:04 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 01:21:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 01:21:04 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 01:21:04 INFO sdk_worker.run: Done consuming work.
19/11/28 01:21:04 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 01:21:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 01:21:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 01:21:04 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/28 01:21:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/28 01:21:04 INFO sdk_worker_main.main: Logging handler created.
19/11/28 01:21:04 INFO sdk_worker_main.start: Status HTTP server running at localhost:39347
19/11/28 01:21:04 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 01:21:04 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 01:21:04 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574904059.7_5141349a-a45a-45e4-bdc3-292aa594092f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 01:21:04 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574904059.7', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56129', 'job_port': u'0'}
19/11/28 01:21:04 INFO statecache.__init__: Creating state cache with size 0
19/11/28 01:21:04 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33883.
19/11/28 01:21:04 INFO sdk_worker.__init__: Control channel established.
19/11/28 01:21:04 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 01:21:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/28 01:21:04 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39815.
19/11/28 01:21:04 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 01:21:04 INFO data_plane.create_data_channel: Creating client data channel for localhost:36559
19/11/28 01:21:04 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 01:21:04 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/28 01:21:04 INFO sdk_worker.run: No more requests from control plane
19/11/28 01:21:04 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 01:21:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 01:21:04 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 01:21:04 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 01:21:04 INFO sdk_worker.run: Done consuming work.
19/11/28 01:21:04 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 01:21:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 01:21:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 01:21:04 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574904059.7_5141349a-a45a-45e4-bdc3-292aa594092f finished.
19/11/28 01:21:04 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/28 01:21:04 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_557239a3-e385-4e7b-95bc-5d6512571339","basePath":"/tmp/sparktestHXa7q3"}: {}
java.io.FileNotFoundException: /tmp/sparktestHXa7q3/job_557239a3-e385-4e7b-95bc-5d6512571339/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139844466358016)>
# Thread: <Thread(Thread-118, started daemon 139844734617344)>
# Thread: <_MainThread(MainThread, started 139845253138176)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574904050.75_50427494-9ba0-4cd8-957b-e84174595caf failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 291.159s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 38s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...

Publishing failed.

The response from https://scans-in.gradle.com/in/5.2.1/2.3 was not from the build scan server.
Your network environment may be interfering, or the service may be unavailable.

If you believe this to be in error, please report this problem via https://gradle.com/scans/help/plugin and include the following via copy/paste:

----------
Gradle version: 5.2.1
Plugin version: 2.3
Request URL: https://scans-in.gradle.com/in/5.2.1/2.3
Request ID: e9b32a28-5c48-4b49-80c8-21ba9b0c5eda
Response status code: 503
Response content type: text/html; charset=utf-8
Response server type: cloudflare
----------

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1643

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1643/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/11/28 00:24:42 INFO sdk_worker_main.start: Status HTTP server running at localhost:37353
19/11/28 00:24:42 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 00:24:42 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 00:24:42 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574900679.75_82e20890-2ede-486f-9881-8e2e1c1f158a', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 00:24:42 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574900679.75', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49575', 'job_port': u'0'}
19/11/28 00:24:42 INFO statecache.__init__: Creating state cache with size 0
19/11/28 00:24:42 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38163.
19/11/28 00:24:42 INFO sdk_worker.__init__: Control channel established.
19/11/28 00:24:42 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/28 00:24:42 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 00:24:42 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38419.
19/11/28 00:24:42 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 00:24:42 INFO data_plane.create_data_channel: Creating client data channel for localhost:39403
19/11/28 00:24:42 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 00:24:42 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/28 00:24:42 INFO sdk_worker.run: No more requests from control plane
19/11/28 00:24:42 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 00:24:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 00:24:42 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 00:24:42 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 00:24:42 INFO sdk_worker.run: Done consuming work.
19/11/28 00:24:42 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 00:24:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 00:24:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 00:24:42 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/28 00:24:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/28 00:24:43 INFO sdk_worker_main.main: Logging handler created.
19/11/28 00:24:43 INFO sdk_worker_main.start: Status HTTP server running at localhost:35081
19/11/28 00:24:43 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 00:24:43 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 00:24:43 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574900679.75_82e20890-2ede-486f-9881-8e2e1c1f158a', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 00:24:43 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574900679.75', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49575', 'job_port': u'0'}
19/11/28 00:24:43 INFO statecache.__init__: Creating state cache with size 0
19/11/28 00:24:43 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35029.
19/11/28 00:24:43 INFO sdk_worker.__init__: Control channel established.
19/11/28 00:24:43 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 00:24:43 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/28 00:24:43 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41163.
19/11/28 00:24:43 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 00:24:43 INFO data_plane.create_data_channel: Creating client data channel for localhost:36973
19/11/28 00:24:43 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 00:24:43 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/28 00:24:43 INFO sdk_worker.run: No more requests from control plane
19/11/28 00:24:43 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 00:24:43 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 00:24:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 00:24:43 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 00:24:43 INFO sdk_worker.run: Done consuming work.
19/11/28 00:24:43 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 00:24:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 00:24:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 00:24:43 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/28 00:24:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/28 00:24:44 INFO sdk_worker_main.main: Logging handler created.
19/11/28 00:24:44 INFO sdk_worker_main.start: Status HTTP server running at localhost:38333
19/11/28 00:24:44 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 00:24:44 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 00:24:44 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574900679.75_82e20890-2ede-486f-9881-8e2e1c1f158a', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 00:24:44 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574900679.75', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49575', 'job_port': u'0'}
19/11/28 00:24:44 INFO statecache.__init__: Creating state cache with size 0
19/11/28 00:24:44 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40641.
19/11/28 00:24:44 INFO sdk_worker.__init__: Control channel established.
19/11/28 00:24:44 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 00:24:44 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/28 00:24:44 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38087.
19/11/28 00:24:44 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 00:24:44 INFO data_plane.create_data_channel: Creating client data channel for localhost:38195
19/11/28 00:24:44 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 00:24:44 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/28 00:24:44 INFO sdk_worker.run: No more requests from control plane
19/11/28 00:24:44 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 00:24:44 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 00:24:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 00:24:44 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 00:24:44 INFO sdk_worker.run: Done consuming work.
19/11/28 00:24:44 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 00:24:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 00:24:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 00:24:44 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/28 00:24:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/28 00:24:45 INFO sdk_worker_main.main: Logging handler created.
19/11/28 00:24:45 INFO sdk_worker_main.start: Status HTTP server running at localhost:36119
19/11/28 00:24:45 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 00:24:45 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 00:24:45 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574900679.75_82e20890-2ede-486f-9881-8e2e1c1f158a', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 00:24:45 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574900679.75', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49575', 'job_port': u'0'}
19/11/28 00:24:45 INFO statecache.__init__: Creating state cache with size 0
19/11/28 00:24:45 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44975.
19/11/28 00:24:45 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/28 00:24:45 INFO sdk_worker.__init__: Control channel established.
19/11/28 00:24:45 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 00:24:45 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35463.
19/11/28 00:24:45 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 00:24:45 INFO data_plane.create_data_channel: Creating client data channel for localhost:42265
19/11/28 00:24:45 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 00:24:45 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/28 00:24:45 INFO sdk_worker.run: No more requests from control plane
19/11/28 00:24:45 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 00:24:45 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 00:24:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 00:24:45 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 00:24:45 INFO sdk_worker.run: Done consuming work.
19/11/28 00:24:45 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 00:24:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 00:24:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 00:24:45 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574900679.75_82e20890-2ede-486f-9881-8e2e1c1f158a finished.
19/11/28 00:24:45 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/28 00:24:45 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_ea82671f-5d7b-4086-a7e2-d8e91a1bbd60","basePath":"/tmp/sparktesthlH2f7"}: {}
java.io.FileNotFoundException: /tmp/sparktesthlH2f7/job_ea82671f-5d7b-4086-a7e2-d8e91a1bbd60/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
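
The cleanup failure above is noisy but harmless to the result (the job
still reaches DONE below): removeArtifacts() unconditionally opens the
staging directory's MANIFEST, and this job staged no artifacts
("GetManifest for __no_artifacts_staged__" above), so the file never
existed. A minimal sketch of a more tolerant cleanup, written in Python
for illustration with hypothetical names (the actual code is the Java
BeamFileSystemArtifactStagingService):

    import os
    import shutil

    def remove_staging_dir(base_path, session_id):
        # Hypothetical sketch, not Beam's implementation.
        job_dir = os.path.join(base_path, session_id)
        manifest = os.path.join(job_dir, "MANIFEST")
        if not os.path.exists(manifest):
            # Nothing was staged for this job; just drop the directory.
            shutil.rmtree(job_dir, ignore_errors=True)
            return
        # Otherwise read the manifest, delete each listed artifact, then
        # remove the directory itself.
        shutil.rmtree(job_dir, ignore_errors=True)
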
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
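
The BaseException above is raised by the test suite's own watchdog: the
traceback bottoms out in portable_runner_test.py's handler, which fires
after 60 seconds and also prints the "Timed out" banners and "# Thread:"
dumps visible in the next failure. A minimal sketch of that pattern
(illustrative names, not the actual test code):

    import signal
    import threading

    TIMEOUT_SECONDS = 60

    def handler(signum, frame):
        msg = 'Timed out after %d seconds.' % TIMEOUT_SECONDS
        print('==================== %s ====================' % msg)
        for t in threading.enumerate():
            print('# Thread: %s' % t)
        # BaseException rather than Exception, so ordinary except clauses
        # in pipeline code cannot swallow the timeout.
        raise BaseException(msg)

    signal.signal(signal.SIGALRM, handler)  # Unix-only
    signal.alarm(TIMEOUT_SECONDS)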

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
==================== Timed out after 60 seconds. ====================
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()

# Thread: <Thread(wait_until_finish_read, started daemon 139722815670016)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
# Thread: <Thread(Thread-117, started daemon 139722824062720)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
# Thread: <_MainThread(MainThread, started 139723603801856)>
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139722315781888)>

    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
# Thread: <Thread(Thread-123, started daemon 139722324174592)>

  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

# Thread: <Thread(Thread-117, started daemon 139722824062720)>
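
(The "# Thread:" lines and the second "Timed out" banner woven through
this traceback are output from that watchdog handler, which appears to
have fired twice while the stack was being printed; the frames read
straight through once the thread-dump lines are skipped.)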

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
# Thread: <Thread(wait_until_finish_read, started daemon 139722815670016)>

  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
# Thread: <_MainThread(MainThread, started 139723603801856)>
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574900670.01_e0d5e542-4f01-4a61-ad1f-9b3db37e1a04 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
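
This one is a missing runner feature rather than a wrong result:
test_sdf_with_watermark_tracking exercises a splittable DoFn that
checkpoints mid-bundle, handing deferred ("residual") work back to the
runner to re-schedule, and the Spark portable runner registers no handler
for those checkpoints. A hypothetical Python mirror of the hook named in
the error (the real interface lives on the Java runner side):

    class BundleCheckpointHandler(object):
        # Illustrative shape only, not Beam's Java interface.
        def on_checkpoint(self, residuals):
            """Re-schedule deferred (residual) work returned mid-bundle."""
            raise NotImplementedError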

----------------------------------------------------------------------
Ran 38 tests in 309.085s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
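(For this build that would be roughly './gradlew
:sdks:python:test-suites:portable:py2:sparkValidatesRunner --stacktrace'
from the ws/src checkout; the exact invocation inside the Jenkins
environment may differ.)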

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 50s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/lh4xlz3qqv6wi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1642

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1642/display/redirect?page=changes>

Changes:

[github] [BEAM-8840][BEAM-3713] Remove setup_requires, tests_require from


------------------------------------------
[...truncated 1.31 MB...]
19/11/27 21:36:36 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1574890595.83_2d1715c3-2926-46b5-99e6-6f5dd5359faa on Spark master local
19/11/27 21:36:36 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
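
Background on that warning: a coder is "consistent with equals" when two
values encode to identical bytes exactly when the values themselves are
equal. The Spark translation groups elements by their encoded key bytes,
so a coder without this property can split one logical key across groups.
A quick illustrative check (the helper name is ours; encode() is the
standard Beam Python coder method):

    def consistent_for(coder, a, b):
        # True when byte-equality of the encodings tracks object equality.
        return (a == b) == (coder.encode(a) == coder.encode(b))
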
19/11/27 21:36:36 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574890595.83_2d1715c3-2926-46b5-99e6-6f5dd5359faa: Pipeline translated successfully. Computing outputs
19/11/27 21:36:36 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 21:36:37 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 21:36:37 INFO sdk_worker_main.main: Logging handler created.
19/11/27 21:36:37 INFO sdk_worker_main.start: Status HTTP server running at localhost:43379
19/11/27 21:36:37 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 21:36:37 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 21:36:37 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574890595.83_2d1715c3-2926-46b5-99e6-6f5dd5359faa', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 21:36:37 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574890595.83', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40005', 'job_port': u'0'}
19/11/27 21:36:37 INFO statecache.__init__: Creating state cache with size 0
19/11/27 21:36:37 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35007.
19/11/27 21:36:37 INFO sdk_worker.__init__: Control channel established.
19/11/27 21:36:37 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 21:36:37 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/11/27 21:36:37 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37341.
19/11/27 21:36:37 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 21:36:37 INFO data_plane.create_data_channel: Creating client data channel for localhost:34115
19/11/27 21:36:37 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 21:36:37 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 21:36:37 INFO sdk_worker.run: No more requests from control plane
19/11/27 21:36:37 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 21:36:37 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 21:36:37 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 21:36:37 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 21:36:37 INFO sdk_worker.run: Done consuming work.
19/11/27 21:36:37 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 21:36:37 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 21:36:37 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 21:36:37 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 21:36:38 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 21:36:38 INFO sdk_worker_main.main: Logging handler created.
19/11/27 21:36:38 INFO sdk_worker_main.start: Status HTTP server running at localhost:34201
19/11/27 21:36:38 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 21:36:38 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 21:36:38 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574890595.83_2d1715c3-2926-46b5-99e6-6f5dd5359faa', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 21:36:38 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574890595.83', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40005', 'job_port': u'0'}
19/11/27 21:36:38 INFO statecache.__init__: Creating state cache with size 0
19/11/27 21:36:38 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40217.
19/11/27 21:36:38 INFO sdk_worker.__init__: Control channel established.
19/11/27 21:36:38 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 21:36:38 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/27 21:36:38 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43483.
19/11/27 21:36:38 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 21:36:38 INFO data_plane.create_data_channel: Creating client data channel for localhost:35257
19/11/27 21:36:38 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 21:36:38 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 21:36:38 INFO sdk_worker.run: No more requests from control plane
19/11/27 21:36:38 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 21:36:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 21:36:38 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 21:36:38 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 21:36:38 INFO sdk_worker.run: Done consuming work.
19/11/27 21:36:38 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 21:36:38 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 21:36:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 21:36:38 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 21:36:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 21:36:39 INFO sdk_worker_main.main: Logging handler created.
19/11/27 21:36:39 INFO sdk_worker_main.start: Status HTTP server running at localhost:38545
19/11/27 21:36:39 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 21:36:39 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 21:36:39 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574890595.83_2d1715c3-2926-46b5-99e6-6f5dd5359faa', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 21:36:39 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574890595.83', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40005', 'job_port': u'0'}
19/11/27 21:36:39 INFO statecache.__init__: Creating state cache with size 0
19/11/27 21:36:39 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35047.
19/11/27 21:36:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/27 21:36:39 INFO sdk_worker.__init__: Control channel established.
19/11/27 21:36:39 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 21:36:39 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44247.
19/11/27 21:36:39 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 21:36:39 INFO data_plane.create_data_channel: Creating client data channel for localhost:46685
19/11/27 21:36:39 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 21:36:39 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 21:36:39 INFO sdk_worker.run: No more requests from control plane
19/11/27 21:36:39 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 21:36:39 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 21:36:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 21:36:39 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 21:36:39 INFO sdk_worker.run: Done consuming work.
19/11/27 21:36:39 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 21:36:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 21:36:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 21:36:39 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 21:36:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 21:36:39 INFO sdk_worker_main.main: Logging handler created.
19/11/27 21:36:39 INFO sdk_worker_main.start: Status HTTP server running at localhost:34973
19/11/27 21:36:39 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 21:36:39 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 21:36:39 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574890595.83_2d1715c3-2926-46b5-99e6-6f5dd5359faa', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 21:36:39 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574890595.83', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40005', 'job_port': u'0'}
19/11/27 21:36:39 INFO statecache.__init__: Creating state cache with size 0
19/11/27 21:36:39 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35087.
19/11/27 21:36:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/27 21:36:39 INFO sdk_worker.__init__: Control channel established.
19/11/27 21:36:39 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 21:36:39 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41475.
19/11/27 21:36:39 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 21:36:39 INFO data_plane.create_data_channel: Creating client data channel for localhost:45697
19/11/27 21:36:39 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 21:36:40 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 21:36:40 INFO sdk_worker.run: No more requests from control plane
19/11/27 21:36:40 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 21:36:40 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 21:36:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 21:36:40 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 21:36:40 INFO sdk_worker.run: Done consuming work.
19/11/27 21:36:40 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 21:36:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 21:36:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 21:36:40 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 21:36:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 21:36:40 INFO sdk_worker_main.main: Logging handler created.
19/11/27 21:36:40 INFO sdk_worker_main.start: Status HTTP server running at localhost:40441
19/11/27 21:36:40 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 21:36:40 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 21:36:40 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574890595.83_2d1715c3-2926-46b5-99e6-6f5dd5359faa', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 21:36:40 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574890595.83', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40005', 'job_port': u'0'}
19/11/27 21:36:40 INFO statecache.__init__: Creating state cache with size 0
19/11/27 21:36:40 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39257.
19/11/27 21:36:40 INFO sdk_worker.__init__: Control channel established.
19/11/27 21:36:40 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/27 21:36:40 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 21:36:40 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38643.
19/11/27 21:36:40 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 21:36:40 INFO data_plane.create_data_channel: Creating client data channel for localhost:37091
19/11/27 21:36:40 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 21:36:40 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 21:36:40 INFO sdk_worker.run: No more requests from control plane
19/11/27 21:36:40 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 21:36:40 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 21:36:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 21:36:40 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 21:36:40 INFO sdk_worker.run: Done consuming work.
19/11/27 21:36:40 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 21:36:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 21:36:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 21:36:40 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574890595.83_2d1715c3-2926-46b5-99e6-6f5dd5359faa finished.
19/11/27 21:36:40 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/27 21:36:40 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_f72d2e2f-43c8-446a-938e-8a7c709cbac4","basePath":"/tmp/sparktestHxgVKm"}: {}
java.io.FileNotFoundException: /tmp/sparktestHxgVKm/job_f72d2e2f-43c8-446a-938e-8a7c709cbac4/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
==================== Timed out after 60 seconds. ====================
    _common.wait(self._state.condition.wait, _response_ready)

# Thread: <Thread(wait_until_finish_read, started daemon 139941151139584)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
# Thread: <Thread(Thread-117, started daemon 139941142746880)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

# Thread: <_MainThread(MainThread, started 139941930587904)>
======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574890586.85_359eff10-6d2c-4b18-8ecf-9716841a3781 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 281.439s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 17s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/iflcvv5vfb2go

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1641

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1641/display/redirect?page=changes>

Changes:

[amyrvold] [BEAM-8832] Allow GCS staging upload chunk size to be increased >1M when


------------------------------------------
[...truncated 1.31 MB...]
19/11/27 21:08:19 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1574888898.16_2f6b5dad-d8be-49ac-be2d-e7cac187169c on Spark master local
19/11/27 21:08:19 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/11/27 21:08:19 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574888898.16_2f6b5dad-d8be-49ac-be2d-e7cac187169c: Pipeline translated successfully. Computing outputs
19/11/27 21:08:19 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 21:08:19 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 21:08:19 INFO sdk_worker_main.main: Logging handler created.
19/11/27 21:08:19 INFO sdk_worker_main.start: Status HTTP server running at localhost:39307
19/11/27 21:08:19 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 21:08:19 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 21:08:19 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574888898.16_2f6b5dad-d8be-49ac-be2d-e7cac187169c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 21:08:19 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574888898.16', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41381', 'job_port': u'0'}
19/11/27 21:08:19 INFO statecache.__init__: Creating state cache with size 0
19/11/27 21:08:19 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39955.
19/11/27 21:08:19 INFO sdk_worker.__init__: Control channel established.
19/11/27 21:08:19 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/11/27 21:08:19 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 21:08:19 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34849.
19/11/27 21:08:19 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 21:08:19 INFO data_plane.create_data_channel: Creating client data channel for localhost:35721
19/11/27 21:08:19 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 21:08:19 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 21:08:19 INFO sdk_worker.run: No more requests from control plane
19/11/27 21:08:19 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 21:08:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 21:08:19 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 21:08:19 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 21:08:19 INFO sdk_worker.run: Done consuming work.
19/11/27 21:08:19 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 21:08:19 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 21:08:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 21:08:20 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 21:08:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 21:08:20 INFO sdk_worker_main.main: Logging handler created.
19/11/27 21:08:20 INFO sdk_worker_main.start: Status HTTP server running at localhost:35961
19/11/27 21:08:20 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 21:08:20 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 21:08:20 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574888898.16_2f6b5dad-d8be-49ac-be2d-e7cac187169c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 21:08:20 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574888898.16', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41381', 'job_port': u'0'}
19/11/27 21:08:20 INFO statecache.__init__: Creating state cache with size 0
19/11/27 21:08:20 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34315.
19/11/27 21:08:20 INFO sdk_worker.__init__: Control channel established.
19/11/27 21:08:20 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/27 21:08:20 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 21:08:20 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35491.
19/11/27 21:08:20 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 21:08:20 INFO data_plane.create_data_channel: Creating client data channel for localhost:43137
19/11/27 21:08:20 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 21:08:20 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 21:08:20 INFO sdk_worker.run: No more requests from control plane
19/11/27 21:08:20 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 21:08:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 21:08:20 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 21:08:20 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 21:08:20 INFO sdk_worker.run: Done consuming work.
19/11/27 21:08:20 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 21:08:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 21:08:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 21:08:20 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 21:08:21 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 21:08:21 INFO sdk_worker_main.main: Logging handler created.
19/11/27 21:08:21 INFO sdk_worker_main.start: Status HTTP server running at localhost:39041
19/11/27 21:08:21 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 21:08:21 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 21:08:21 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574888898.16_2f6b5dad-d8be-49ac-be2d-e7cac187169c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 21:08:21 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574888898.16', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41381', 'job_port': u'0'}
19/11/27 21:08:21 INFO statecache.__init__: Creating state cache with size 0
19/11/27 21:08:21 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37555.
19/11/27 21:08:21 INFO sdk_worker.__init__: Control channel established.
19/11/27 21:08:21 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 21:08:21 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/27 21:08:21 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35615.
19/11/27 21:08:21 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 21:08:21 INFO data_plane.create_data_channel: Creating client data channel for localhost:33027
19/11/27 21:08:21 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 21:08:21 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 21:08:21 INFO sdk_worker.run: No more requests from control plane
19/11/27 21:08:21 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 21:08:21 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 21:08:21 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 21:08:21 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 21:08:21 INFO sdk_worker.run: Done consuming work.
19/11/27 21:08:21 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 21:08:21 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 21:08:21 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 21:08:21 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 21:08:22 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 21:08:22 INFO sdk_worker_main.main: Logging handler created.
19/11/27 21:08:22 INFO sdk_worker_main.start: Status HTTP server running at localhost:33153
19/11/27 21:08:22 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 21:08:22 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 21:08:22 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574888898.16_2f6b5dad-d8be-49ac-be2d-e7cac187169c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 21:08:22 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574888898.16', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41381', 'job_port': u'0'}
19/11/27 21:08:22 INFO statecache.__init__: Creating state cache with size 0
19/11/27 21:08:22 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36967.
19/11/27 21:08:22 INFO sdk_worker.__init__: Control channel established.
19/11/27 21:08:22 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 21:08:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/27 21:08:22 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37871.
19/11/27 21:08:22 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 21:08:22 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 21:08:22 INFO data_plane.create_data_channel: Creating client data channel for localhost:35453
19/11/27 21:08:22 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 21:08:22 INFO sdk_worker.run: No more requests from control plane
19/11/27 21:08:22 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 21:08:22 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 21:08:22 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 21:08:22 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 21:08:22 INFO sdk_worker.run: Done consuming work.
19/11/27 21:08:22 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 21:08:22 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 21:08:22 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 21:08:22 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 21:08:23 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 21:08:23 INFO sdk_worker_main.main: Logging handler created.
19/11/27 21:08:23 INFO sdk_worker_main.start: Status HTTP server running at localhost:35099
19/11/27 21:08:23 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 21:08:23 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 21:08:23 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574888898.16_2f6b5dad-d8be-49ac-be2d-e7cac187169c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 21:08:23 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574888898.16', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41381', 'job_port': u'0'}
19/11/27 21:08:23 INFO statecache.__init__: Creating state cache with size 0
19/11/27 21:08:23 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45727.
19/11/27 21:08:23 INFO sdk_worker.__init__: Control channel established.
19/11/27 21:08:23 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 21:08:23 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/27 21:08:23 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42735.
19/11/27 21:08:23 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 21:08:23 INFO data_plane.create_data_channel: Creating client data channel for localhost:41493
19/11/27 21:08:23 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 21:08:23 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 21:08:23 INFO sdk_worker.run: No more requests from control plane
19/11/27 21:08:23 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 21:08:23 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 21:08:23 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 21:08:23 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 21:08:23 INFO sdk_worker.run: Done consuming work.
19/11/27 21:08:23 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 21:08:23 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 21:08:23 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 21:08:23 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574888898.16_2f6b5dad-d8be-49ac-be2d-e7cac187169c finished.
19/11/27 21:08:23 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/27 21:08:23 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_f5268c2a-b8ed-4a33-baf4-dd82831f332c","basePath":"/tmp/sparktestgJa7j9"}: {}
java.io.FileNotFoundException: /tmp/sparktestgJa7j9/job_f5268c2a-b8ed-4a33-baf4-dd82831f332c/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
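The FileNotFoundException above is cleanup noise rather than the test failure itself: this job staged no artifacts (note the repeated "GetManifest for __no_artifacts_staged__" lines), yet the cleanup path still tries to open a MANIFEST under the staging token. A hedged sketch of the kind of guard that would avoid the spurious ERROR; remove_staged_artifacts is a hypothetical helper, and Python is used for all sketches here even though the service in the stack trace is Java:

    import errno
    import os
    import shutil

    def remove_staged_artifacts(base_path, session_id):
        staging_dir = os.path.join(base_path, session_id)
        manifest = os.path.join(staging_dir, 'MANIFEST')
        if not os.path.exists(manifest):
            # Nothing was staged for this token, so there is nothing to
            # clean up -- and nothing worth logging an ERROR about.
            return
        try:
            shutil.rmtree(staging_dir)
        except OSError as e:
            if e.errno != errno.ENOENT:  # a concurrent cleanup already won
                raise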
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
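The BaseException is deliberate: the test suite installs an alarm-based watchdog so a hung pipeline fails after 60 seconds and dumps the live threads (the "# Thread:" lines seen elsewhere in this log). A minimal sketch of that pattern, inferred from the traceback rather than copied from portable_runner_test.py:

    import signal
    import threading

    TIMEOUT_SECS = 60

    def handler(signum, frame):
        msg = 'Timed out after %d seconds.' % TIMEOUT_SECS
        print('=' * 20 + ' ' + msg + ' ' + '=' * 20)
        for t in threading.enumerate():
            print('# Thread: %s' % t)  # mirrors the thread dumps in this log
        # BaseException, not Exception, so broad `except Exception` blocks
        # in the code under test cannot swallow the timeout.
        raise BaseException(msg)

    signal.signal(signal.SIGALRM, handler)
    signal.alarm(TIMEOUT_SECS)  # cancel with signal.alarm(0) when the test ends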

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140558235772672)>

# Thread: <Thread(Thread-119, started daemon 140558218987264)>

# Thread: <_MainThread(MainThread, started 140559022552832)>

Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574888889.07_ca793a48-37fa-49b6-8753-55eb43334f17 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
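Unlike the timeout above, this failure is a missing runner capability: test_sdf_with_watermark_tracking exercises a splittable DoFn that hands unfinished work (a checkpoint residual) back to the runner, and the portable Spark runner registers no handler to receive it. A conceptual Python sketch of the contract the Java ActiveBundle enforces (the real class raises UnsupportedOperationException):

    class ActiveBundle(object):
        def __init__(self, checkpoint_handler=None):
            self._checkpoint_handler = checkpoint_handler

        def on_checkpoint(self, residual):
            if self._checkpoint_handler is None:
                # The condition the Spark portable runner hits: SDF work was
                # deferred, but nobody is wired up to reschedule it.
                raise NotImplementedError(
                    'The ActiveBundle does not have a registered bundle '
                    'checkpoint handler.')
            self._checkpoint_handler(residual)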

----------------------------------------------------------------------
Ran 38 tests in 290.348s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 29s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/5puhpxlat2lty

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1640

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1640/display/redirect?page=changes>

Changes:

[github] Bump python precommit timeout to 3hrs


------------------------------------------
[...truncated 1.31 MB...]
19/11/27 19:21:44 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1574882503.37_891a5744-359f-4084-a700-2f616dfcfcd3 on Spark master local
19/11/27 19:21:44 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
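This warning matters for grouping correctness: the Spark runner groups elements by the encoded key bytes, which is only safe if equal keys always encode to identical bytes ("consistent with equals"). A self-contained illustration of the hazard (group_by_encoded_key is illustrative, not Beam code):

    import collections

    def group_by_encoded_key(encode, pairs):
        groups = collections.defaultdict(list)
        for key, value in pairs:
            groups[encode(key)].append(value)  # groups by bytes, not by key
        return groups

    # 1 == 1.0 in Python, but an encoder that tags the type produces
    # different bytes for each, so one logical key splits into two groups:
    enc = lambda k: '%s:%r' % (type(k).__name__, k)
    print(group_by_encoded_key(enc, [(1, 'a'), (1.0, 'b')]))
    # defaultdict contents: {'int:1': ['a'], 'float:1.0': ['b']}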
19/11/27 19:21:44 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574882503.37_891a5744-359f-4084-a700-2f616dfcfcd3: Pipeline translated successfully. Computing outputs
19/11/27 19:21:44 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 19:21:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 19:21:45 INFO sdk_worker_main.main: Logging handler created.
19/11/27 19:21:45 INFO sdk_worker_main.start: Status HTTP server running at localhost:33585
19/11/27 19:21:45 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 19:21:45 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 19:21:45 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574882503.37_891a5744-359f-4084-a700-2f616dfcfcd3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 19:21:45 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574882503.37', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43037', 'job_port': u'0'}
19/11/27 19:21:45 INFO statecache.__init__: Creating state cache with size 0
19/11/27 19:21:45 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42671.
19/11/27 19:21:45 INFO sdk_worker.__init__: Control channel established.
19/11/27 19:21:45 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 19:21:45 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/11/27 19:21:45 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43543.
19/11/27 19:21:45 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 19:21:45 INFO data_plane.create_data_channel: Creating client data channel for localhost:40521
19/11/27 19:21:45 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 19:21:45 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 19:21:45 INFO sdk_worker.run: No more requests from control plane
19/11/27 19:21:45 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 19:21:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 19:21:45 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 19:21:45 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 19:21:45 INFO sdk_worker.run: Done consuming work.
19/11/27 19:21:45 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 19:21:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 19:21:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 19:21:45 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 19:21:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 19:21:45 INFO sdk_worker_main.main: Logging handler created.
19/11/27 19:21:45 INFO sdk_worker_main.start: Status HTTP server running at localhost:43567
19/11/27 19:21:45 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 19:21:45 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 19:21:45 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574882503.37_891a5744-359f-4084-a700-2f616dfcfcd3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 19:21:45 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574882503.37', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43037', 'job_port': u'0'}
19/11/27 19:21:45 INFO statecache.__init__: Creating state cache with size 0
19/11/27 19:21:45 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46057.
19/11/27 19:21:45 INFO sdk_worker.__init__: Control channel established.
19/11/27 19:21:45 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/27 19:21:45 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 19:21:45 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46867.
19/11/27 19:21:45 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 19:21:45 INFO data_plane.create_data_channel: Creating client data channel for localhost:46627
19/11/27 19:21:45 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 19:21:46 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 19:21:46 INFO sdk_worker.run: No more requests from control plane
19/11/27 19:21:46 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 19:21:46 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 19:21:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 19:21:46 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 19:21:46 INFO sdk_worker.run: Done consuming work.
19/11/27 19:21:46 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 19:21:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 19:21:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 19:21:46 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 19:21:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 19:21:46 INFO sdk_worker_main.main: Logging handler created.
19/11/27 19:21:46 INFO sdk_worker_main.start: Status HTTP server running at localhost:41987
19/11/27 19:21:46 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 19:21:46 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 19:21:46 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574882503.37_891a5744-359f-4084-a700-2f616dfcfcd3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 19:21:46 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574882503.37', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43037', 'job_port': u'0'}
19/11/27 19:21:46 INFO statecache.__init__: Creating state cache with size 0
19/11/27 19:21:46 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44257.
19/11/27 19:21:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/27 19:21:46 INFO sdk_worker.__init__: Control channel established.
19/11/27 19:21:46 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 19:21:46 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38389.
19/11/27 19:21:46 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 19:21:46 INFO data_plane.create_data_channel: Creating client data channel for localhost:43397
19/11/27 19:21:46 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 19:21:46 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 19:21:46 INFO sdk_worker.run: No more requests from control plane
19/11/27 19:21:46 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 19:21:46 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 19:21:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 19:21:46 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 19:21:46 INFO sdk_worker.run: Done consuming work.
19/11/27 19:21:46 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 19:21:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 19:21:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 19:21:47 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 19:21:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 19:21:47 INFO sdk_worker_main.main: Logging handler created.
19/11/27 19:21:47 INFO sdk_worker_main.start: Status HTTP server running at localhost:41757
19/11/27 19:21:47 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 19:21:47 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 19:21:47 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574882503.37_891a5744-359f-4084-a700-2f616dfcfcd3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 19:21:47 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574882503.37', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43037', 'job_port': u'0'}
19/11/27 19:21:47 INFO statecache.__init__: Creating state cache with size 0
19/11/27 19:21:47 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33851.
19/11/27 19:21:47 INFO sdk_worker.__init__: Control channel established.
19/11/27 19:21:47 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/27 19:21:47 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 19:21:47 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33091.
19/11/27 19:21:47 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 19:21:47 INFO data_plane.create_data_channel: Creating client data channel for localhost:42197
19/11/27 19:21:47 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 19:21:47 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 19:21:47 INFO sdk_worker.run: No more requests from control plane
19/11/27 19:21:47 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 19:21:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 19:21:47 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 19:21:47 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 19:21:47 INFO sdk_worker.run: Done consuming work.
19/11/27 19:21:47 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 19:21:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 19:21:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 19:21:47 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 19:21:48 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 19:21:48 INFO sdk_worker_main.main: Logging handler created.
19/11/27 19:21:48 INFO sdk_worker_main.start: Status HTTP server running at localhost:43493
19/11/27 19:21:48 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 19:21:48 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 19:21:48 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574882503.37_891a5744-359f-4084-a700-2f616dfcfcd3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 19:21:48 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574882503.37', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43037', 'job_port': u'0'}
19/11/27 19:21:48 INFO statecache.__init__: Creating state cache with size 0
19/11/27 19:21:48 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38923.
19/11/27 19:21:48 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/27 19:21:48 INFO sdk_worker.__init__: Control channel established.
19/11/27 19:21:48 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 19:21:48 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46733.
19/11/27 19:21:48 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 19:21:48 INFO data_plane.create_data_channel: Creating client data channel for localhost:39941
19/11/27 19:21:48 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 19:21:48 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 19:21:48 INFO sdk_worker.run: No more requests from control plane
19/11/27 19:21:48 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 19:21:48 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 19:21:48 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 19:21:48 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 19:21:48 INFO sdk_worker.run: Done consuming work.
19/11/27 19:21:48 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 19:21:48 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 19:21:48 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 19:21:48 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574882503.37_891a5744-359f-4084-a700-2f616dfcfcd3 finished.
19/11/27 19:21:48 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/27 19:21:48 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_751fd32a-68ba-4208-a568-0e1fd3946593","basePath":"/tmp/sparktestuhB3Us"}: {}
java.io.FileNotFoundException: /tmp/sparktestuhB3Us/job_751fd32a-68ba-4208-a568-0e1fd3946593/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140140168128256)>

# Thread: <Thread(Thread-117, started daemon 140139808417536)>

# Thread: <_MainThread(MainThread, started 140140950038272)>
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574882494.21_cc9e8cc7-0fb4-4f1e-a371-7620feae9e59 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 288.292s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 31s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/5ftv7xlogh7fe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1639

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1639/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/11/27 18:17:24 INFO sdk_worker_main.start: Status HTTP server running at localhost:37169
19/11/27 18:17:24 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 18:17:24 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 18:17:24 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574878641.53_27107844-74cf-4a96-9fdf-65925e7f6ebb', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 18:17:24 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574878641.53', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:47043', 'job_port': u'0'}
19/11/27 18:17:24 INFO statecache.__init__: Creating state cache with size 0
19/11/27 18:17:24 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45027.
19/11/27 18:17:24 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/27 18:17:24 INFO sdk_worker.__init__: Control channel established.
19/11/27 18:17:24 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 18:17:24 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36665.
19/11/27 18:17:24 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 18:17:24 INFO data_plane.create_data_channel: Creating client data channel for localhost:35847
19/11/27 18:17:24 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 18:17:24 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 18:17:24 INFO sdk_worker.run: No more requests from control plane
19/11/27 18:17:24 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 18:17:24 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 18:17:24 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 18:17:24 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 18:17:24 INFO sdk_worker.run: Done consuming work.
19/11/27 18:17:24 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 18:17:24 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 18:17:24 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 18:17:24 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 18:17:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 18:17:25 INFO sdk_worker_main.main: Logging handler created.
19/11/27 18:17:25 INFO sdk_worker_main.start: Status HTTP server running at localhost:44165
19/11/27 18:17:25 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 18:17:25 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 18:17:25 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574878641.53_27107844-74cf-4a96-9fdf-65925e7f6ebb', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 18:17:25 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574878641.53', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:47043', 'job_port': u'0'}
19/11/27 18:17:25 INFO statecache.__init__: Creating state cache with size 0
19/11/27 18:17:25 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39813.
19/11/27 18:17:25 INFO sdk_worker.__init__: Control channel established.
19/11/27 18:17:25 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/27 18:17:25 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 18:17:25 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38279.
19/11/27 18:17:25 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 18:17:25 INFO data_plane.create_data_channel: Creating client data channel for localhost:43777
19/11/27 18:17:25 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 18:17:25 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 18:17:25 INFO sdk_worker.run: No more requests from control plane
19/11/27 18:17:25 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 18:17:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 18:17:25 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 18:17:25 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 18:17:25 INFO sdk_worker.run: Done consuming work.
19/11/27 18:17:25 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 18:17:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 18:17:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 18:17:25 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 18:17:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 18:17:26 INFO sdk_worker_main.main: Logging handler created.
19/11/27 18:17:26 INFO sdk_worker_main.start: Status HTTP server running at localhost:43089
19/11/27 18:17:26 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 18:17:26 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 18:17:26 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574878641.53_27107844-74cf-4a96-9fdf-65925e7f6ebb', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 18:17:26 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574878641.53', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:47043', 'job_port': u'0'}
19/11/27 18:17:26 INFO statecache.__init__: Creating state cache with size 0
19/11/27 18:17:26 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39643.
19/11/27 18:17:26 INFO sdk_worker.__init__: Control channel established.
19/11/27 18:17:26 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/27 18:17:26 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 18:17:26 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39875.
19/11/27 18:17:26 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 18:17:26 INFO data_plane.create_data_channel: Creating client data channel for localhost:38185
19/11/27 18:17:26 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 18:17:26 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 18:17:26 INFO sdk_worker.run: No more requests from control plane
19/11/27 18:17:26 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 18:17:26 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 18:17:26 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 18:17:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 18:17:26 INFO sdk_worker.run: Done consuming work.
19/11/27 18:17:26 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 18:17:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 18:17:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 18:17:26 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 18:17:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 18:17:27 INFO sdk_worker_main.main: Logging handler created.
19/11/27 18:17:27 INFO sdk_worker_main.start: Status HTTP server running at localhost:46209
19/11/27 18:17:27 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 18:17:27 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 18:17:27 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574878641.53_27107844-74cf-4a96-9fdf-65925e7f6ebb', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 18:17:27 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574878641.53', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:47043', 'job_port': u'0'}
19/11/27 18:17:27 INFO statecache.__init__: Creating state cache with size 0
19/11/27 18:17:27 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37653.
19/11/27 18:17:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/27 18:17:27 INFO sdk_worker.__init__: Control channel established.
19/11/27 18:17:27 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 18:17:27 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38259.
19/11/27 18:17:27 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 18:17:27 INFO data_plane.create_data_channel: Creating client data channel for localhost:45893
19/11/27 18:17:27 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 18:17:27 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 18:17:27 INFO sdk_worker.run: No more requests from control plane
19/11/27 18:17:27 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 18:17:27 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 18:17:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 18:17:27 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 18:17:27 INFO sdk_worker.run: Done consuming work.
19/11/27 18:17:27 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 18:17:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 18:17:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 18:17:27 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574878641.53_27107844-74cf-4a96-9fdf-65925e7f6ebb finished.
19/11/27 18:17:27 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/27 18:17:27 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_d23c780e-adca-4061-9daa-070339f340a6","basePath":"/tmp/sparktestsOchFV"}: {}
java.io.FileNotFoundException: /tmp/sparktestsOchFV/job_d23c780e-adca-4061-9daa-070339f340a6/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
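The ERROR itself is cleanup noise: the job reaches DONE on the next line, and the staging directory simply has no MANIFEST left to read. A hedged sketch of probing for that file through Beam's filesystem abstraction, with the path copied from the stack trace above:

    from apache_beam.io.filesystems import FileSystems

    manifest = '/tmp/sparktestsOchFV/job_d23c780e-adca-4061-9daa-070339f340a6/MANIFEST'
    # FileSystems.match reports an empty metadata list for a missing file
    # instead of raising, unlike the direct open() in the stack trace.
    if not FileSystems.match([manifest])[0].metadata_list:
        print('MANIFEST already gone; nothing to clean up.')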
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
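The handler frame at the bottom of that traceback is the test suite's watchdog, which also produces the "Timed out" banners and thread dumps scattered through the output below. A rough reconstruction of the pattern, assuming a SIGALRM-based alarm (the 60-second limit and the output format match the log; the details are guessed):

    import signal
    import threading
    import traceback

    TIMEOUT_SECS = 60

    def handler(signum, frame):
        msg = 'Timed out after %d seconds.' % TIMEOUT_SECS
        print('==================== %s ====================' % msg)
        # Dump live threads so hung gRPC reads are visible in the log.
        for t in threading.enumerate():
            print('# Thread: %s' % t)
        traceback.print_stack(frame)
        # BaseException rather than Exception, so that no ordinary
        # except-clause in the code under test can swallow the timeout.
        raise BaseException(msg)

    signal.signal(signal.SIGALRM, handler)
    signal.alarm(TIMEOUT_SECS)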

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139722964215552)>
# Thread: <Thread(Thread-119, started daemon 139722955822848)>
# Thread: <_MainThread(MainThread, started 139723746125568)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139722861045504)>
# Thread: <Thread(Thread-123, started daemon 139722869438208)>
# Thread: <_MainThread(MainThread, started 139723746125568)>
# Thread: <Thread(Thread-119, started daemon 139722955822848)>
# Thread: <Thread(wait_until_finish_read, started daemon 139722964215552)>

Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574878631.59_cb74ba48-70b2-4791-b26b-424c8f8b3a6d failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
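This one is a feature gap rather than flakiness: the Spark portable runner has no bundle checkpoint handler, so a splittable DoFn that defers residual work (as the watermark-tracking test does) cannot run on it. Until support lands, a runner-specific suite would typically override and skip the test; a self-contained sketch of that pattern (in the real suite the class subclasses the portable runner test base):

    import unittest

    class SparkRunnerTest(unittest.TestCase):
        def test_sdf_with_watermark_tracking(self):
            raise unittest.SkipTest(
                'ActiveBundle has no registered bundle checkpoint handler '
                'on the Spark portable runner; SDF tests cannot run yet.')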

----------------------------------------------------------------------
Ran 38 tests in 312.334s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 57s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/3slrxck6qxx26

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1638

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1638/display/redirect>

Changes:


------------------------------------------
[...truncated 1.31 MB...]
19/11/27 12:12:44 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1574856763.17_cd9ac24b-231f-4937-9f49-414ef30729ec on Spark master local
19/11/27 12:12:44 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/11/27 12:12:44 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574856763.17_cd9ac24b-231f-4937-9f49-414ef30729ec: Pipeline translated successfully. Computing outputs
19/11/27 12:12:44 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 12:12:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 12:12:45 INFO sdk_worker_main.main: Logging handler created.
19/11/27 12:12:45 INFO sdk_worker_main.start: Status HTTP server running at localhost:34677
19/11/27 12:12:45 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 12:12:45 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 12:12:45 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574856763.17_cd9ac24b-231f-4937-9f49-414ef30729ec', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 12:12:45 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574856763.17', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:60381', 'job_port': u'0'}
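The options dict above mirrors the flags a client passes when submitting against the portable job endpoint. A minimal sketch of such a submission (the endpoint and worker command are placeholders modeled on the log):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:60381',  # job server address from the log
        '--environment_type=PROCESS',
        '--environment_config={"command": "/path/to/sdk_worker.sh"}',
        '--experiments=beam_fn_api',
    ])

    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create(['a', 'b', 'c']) | beam.Map(lambda x: x)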
19/11/27 12:12:45 INFO statecache.__init__: Creating state cache with size 0
19/11/27 12:12:45 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34373.
19/11/27 12:12:45 INFO sdk_worker.__init__: Control channel established.
19/11/27 12:12:45 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 12:12:45 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/11/27 12:12:45 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33767.
19/11/27 12:12:45 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 12:12:45 INFO data_plane.create_data_channel: Creating client data channel for localhost:34599
19/11/27 12:12:45 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 12:12:45 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 12:12:45 INFO sdk_worker.run: No more requests from control plane
19/11/27 12:12:45 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 12:12:45 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 12:12:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 12:12:45 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 12:12:45 INFO sdk_worker.run: Done consuming work.
19/11/27 12:12:45 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 12:12:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 12:12:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 12:12:45 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 12:12:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 12:12:46 INFO sdk_worker_main.main: Logging handler created.
19/11/27 12:12:46 INFO sdk_worker_main.start: Status HTTP server running at localhost:38201
19/11/27 12:12:46 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 12:12:46 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 12:12:46 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574856763.17_cd9ac24b-231f-4937-9f49-414ef30729ec', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 12:12:46 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574856763.17', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:60381', 'job_port': u'0'}
19/11/27 12:12:46 INFO statecache.__init__: Creating state cache with size 0
19/11/27 12:12:46 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41665.
19/11/27 12:12:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/27 12:12:46 INFO sdk_worker.__init__: Control channel established.
19/11/27 12:12:46 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 12:12:46 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35261.
19/11/27 12:12:46 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 12:12:46 INFO data_plane.create_data_channel: Creating client data channel for localhost:46265
19/11/27 12:12:46 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 12:12:46 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 12:12:46 INFO sdk_worker.run: No more requests from control plane
19/11/27 12:12:46 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 12:12:46 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 12:12:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 12:12:46 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 12:12:46 INFO sdk_worker.run: Done consuming work.
19/11/27 12:12:46 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 12:12:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 12:12:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 12:12:46 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 12:12:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 12:12:47 INFO sdk_worker_main.main: Logging handler created.
19/11/27 12:12:47 INFO sdk_worker_main.start: Status HTTP server running at localhost:37223
19/11/27 12:12:47 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 12:12:47 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 12:12:47 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574856763.17_cd9ac24b-231f-4937-9f49-414ef30729ec', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 12:12:47 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574856763.17', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:60381', 'job_port': u'0'}
19/11/27 12:12:47 INFO statecache.__init__: Creating state cache with size 0
19/11/27 12:12:47 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44145.
19/11/27 12:12:47 INFO sdk_worker.__init__: Control channel established.
19/11/27 12:12:47 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/27 12:12:47 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 12:12:47 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46137.
19/11/27 12:12:47 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 12:12:47 INFO data_plane.create_data_channel: Creating client data channel for localhost:46131
19/11/27 12:12:47 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 12:12:47 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 12:12:47 INFO sdk_worker.run: No more requests from control plane
19/11/27 12:12:47 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 12:12:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 12:12:47 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 12:12:47 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 12:12:47 INFO sdk_worker.run: Done consuming work.
19/11/27 12:12:47 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 12:12:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 12:12:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 12:12:47 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 12:12:48 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 12:12:48 INFO sdk_worker_main.main: Logging handler created.
19/11/27 12:12:48 INFO sdk_worker_main.start: Status HTTP server running at localhost:39895
19/11/27 12:12:48 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 12:12:48 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 12:12:48 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574856763.17_cd9ac24b-231f-4937-9f49-414ef30729ec', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 12:12:48 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574856763.17', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:60381', 'job_port': u'0'}
19/11/27 12:12:48 INFO statecache.__init__: Creating state cache with size 0
19/11/27 12:12:48 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36641.
19/11/27 12:12:48 INFO sdk_worker.__init__: Control channel established.
19/11/27 12:12:48 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 12:12:48 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/27 12:12:48 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35045.
19/11/27 12:12:48 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 12:12:48 INFO data_plane.create_data_channel: Creating client data channel for localhost:41351
19/11/27 12:12:48 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 12:12:48 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 12:12:48 INFO sdk_worker.run: No more requests from control plane
19/11/27 12:12:48 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 12:12:48 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 12:12:48 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 12:12:48 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 12:12:48 INFO sdk_worker.run: Done consuming work.
19/11/27 12:12:48 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 12:12:48 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 12:12:48 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 12:12:48 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 12:12:49 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 12:12:49 INFO sdk_worker_main.main: Logging handler created.
19/11/27 12:12:49 INFO sdk_worker_main.start: Status HTTP server running at localhost:32913
19/11/27 12:12:49 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 12:12:49 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 12:12:49 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574856763.17_cd9ac24b-231f-4937-9f49-414ef30729ec', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 12:12:49 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574856763.17', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:60381', 'job_port': u'0'}
19/11/27 12:12:49 INFO statecache.__init__: Creating state cache with size 0
19/11/27 12:12:49 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45329.
19/11/27 12:12:49 INFO sdk_worker.__init__: Control channel established.
19/11/27 12:12:49 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 12:12:49 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/27 12:12:49 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44791.
19/11/27 12:12:49 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 12:12:49 INFO data_plane.create_data_channel: Creating client data channel for localhost:36881
19/11/27 12:12:49 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 12:12:49 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 12:12:49 INFO sdk_worker.run: No more requests from control plane
19/11/27 12:12:49 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 12:12:49 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 12:12:49 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 12:12:49 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 12:12:49 INFO sdk_worker.run: Done consuming work.
19/11/27 12:12:49 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 12:12:49 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 12:12:49 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 12:12:49 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574856763.17_cd9ac24b-231f-4937-9f49-414ef30729ec finished.
19/11/27 12:12:49 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/27 12:12:49 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_67042f10-937e-47bc-a621-bf08e882644c","basePath":"/tmp/sparktesthUpP7J"}: {}
java.io.FileNotFoundException: /tmp/sparktesthUpP7J/job_67042f10-937e-47bc-a621-bf08e882644c/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140071634204416)>
# Thread: <Thread(Thread-119, started daemon 140071625811712)>
# Thread: <_MainThread(MainThread, started 140072495224576)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574856751.86_11a8bf6d-283d-4fb7-8081-dfa792e92529 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 307.614s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 48s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/yomtenbhvhx6g

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1637

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1637/display/redirect?page=changes>

Changes:

[aromanenko.dev] [BEAM-8470] Exclude failed ValidatesRunner tests


------------------------------------------
[...truncated 1.32 MB...]
19/11/27 09:11:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 09:11:41 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 09:11:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 09:11:42 INFO sdk_worker_main.main: Logging handler created.
19/11/27 09:11:42 INFO sdk_worker_main.start: Status HTTP server running at localhost:34781
19/11/27 09:11:42 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 09:11:42 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 09:11:42 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574845899.59_04729cc3-758d-4bd5-9679-7b2d6344d85c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 09:11:42 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574845899.59', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44175', 'job_port': u'0'}
19/11/27 09:11:42 INFO statecache.__init__: Creating state cache with size 0
19/11/27 09:11:42 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39415.
19/11/27 09:11:42 INFO sdk_worker.__init__: Control channel established.
19/11/27 09:11:42 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 09:11:42 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/27 09:11:42 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42703.
19/11/27 09:11:42 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 09:11:42 INFO data_plane.create_data_channel: Creating client data channel for localhost:35661
19/11/27 09:11:42 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 09:11:42 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 09:11:42 INFO sdk_worker.run: No more requests from control plane
19/11/27 09:11:42 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 09:11:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 09:11:42 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 09:11:42 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 09:11:42 INFO sdk_worker.run: Done consuming work.
19/11/27 09:11:42 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 09:11:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 09:11:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 09:11:42 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 09:11:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 09:11:43 INFO sdk_worker_main.main: Logging handler created.
19/11/27 09:11:43 INFO sdk_worker_main.start: Status HTTP server running at localhost:35321
19/11/27 09:11:43 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 09:11:43 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 09:11:43 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574845899.59_04729cc3-758d-4bd5-9679-7b2d6344d85c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 09:11:43 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574845899.59', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44175', 'job_port': u'0'}
19/11/27 09:11:43 INFO statecache.__init__: Creating state cache with size 0
19/11/27 09:11:43 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44535.
19/11/27 09:11:43 INFO sdk_worker.__init__: Control channel established.
19/11/27 09:11:43 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/27 09:11:43 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 09:11:43 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41019.
19/11/27 09:11:43 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 09:11:43 INFO data_plane.create_data_channel: Creating client data channel for localhost:45101
19/11/27 09:11:43 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 09:11:43 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 09:11:43 INFO sdk_worker.run: No more requests from control plane
19/11/27 09:11:43 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 09:11:43 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 09:11:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 09:11:43 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 09:11:43 INFO sdk_worker.run: Done consuming work.
19/11/27 09:11:43 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 09:11:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 09:11:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 09:11:43 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 09:11:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 09:11:44 INFO sdk_worker_main.main: Logging handler created.
19/11/27 09:11:44 INFO sdk_worker_main.start: Status HTTP server running at localhost:35757
19/11/27 09:11:44 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 09:11:44 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 09:11:44 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574845899.59_04729cc3-758d-4bd5-9679-7b2d6344d85c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 09:11:44 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574845899.59', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44175', 'job_port': u'0'}
19/11/27 09:11:44 INFO statecache.__init__: Creating state cache with size 0
19/11/27 09:11:44 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36847.
19/11/27 09:11:44 INFO sdk_worker.__init__: Control channel established.
19/11/27 09:11:44 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 09:11:44 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/27 09:11:44 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38115.
19/11/27 09:11:44 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 09:11:44 INFO data_plane.create_data_channel: Creating client data channel for localhost:35125
19/11/27 09:11:44 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 09:11:44 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 09:11:44 INFO sdk_worker.run: No more requests from control plane
19/11/27 09:11:44 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 09:11:44 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 09:11:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 09:11:44 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 09:11:44 INFO sdk_worker.run: Done consuming work.
19/11/27 09:11:44 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 09:11:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 09:11:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 09:11:44 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 09:11:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 09:11:44 INFO sdk_worker_main.main: Logging handler created.
19/11/27 09:11:44 INFO sdk_worker_main.start: Status HTTP server running at localhost:39573
19/11/27 09:11:44 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 09:11:44 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 09:11:44 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574845899.59_04729cc3-758d-4bd5-9679-7b2d6344d85c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 09:11:44 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574845899.59', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44175', 'job_port': u'0'}
19/11/27 09:11:44 INFO statecache.__init__: Creating state cache with size 0
19/11/27 09:11:44 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33195.
19/11/27 09:11:44 INFO sdk_worker.__init__: Control channel established.
19/11/27 09:11:44 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/27 09:11:45 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 09:11:45 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34841.
19/11/27 09:11:45 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 09:11:45 INFO data_plane.create_data_channel: Creating client data channel for localhost:34321
19/11/27 09:11:45 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 09:11:45 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 09:11:45 INFO sdk_worker.run: No more requests from control plane
19/11/27 09:11:45 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 09:11:45 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 09:11:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 09:11:45 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 09:11:45 INFO sdk_worker.run: Done consuming work.
19/11/27 09:11:45 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 09:11:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 09:11:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 09:11:45 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574845899.59_04729cc3-758d-4bd5-9679-7b2d6344d85c finished.
19/11/27 09:11:45 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/27 09:11:45 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_92bf4d4d-91f4-4edc-a642-a4f396b9a00a","basePath":"/tmp/sparktestAnP_bS"}: {}
java.io.FileNotFoundException: /tmp/sparktestAnP_bS/job_92bf4d4d-91f4-4edc-a642-a4f396b9a00a/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
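For context, the failing test exercises a stateful ParDo whose keys use a user-defined coder, so every state request must round-trip the key through that coder. A loose reconstruction of the shape (all names and types here are invented for illustration, not the actual test code):

    import apache_beam as beam
    from apache_beam import coders
    from apache_beam.transforms.userstate import BagStateSpec

    class MyKey(object):
        def __init__(self, name):
            self.name = name

    class MyKeyCoder(coders.Coder):
        def encode(self, key):
            return key.name.encode('utf-8')

        def decode(self, encoded):
            return MyKey(encoded.decode('utf-8'))

        def is_deterministic(self):
            return True  # keys used for state/grouping must encode stably

    coders.registry.register_coder(MyKey, MyKeyCoder)

    class BufferingDoFn(beam.DoFn):
        BUFFER = BagStateSpec('buffer', coders.VarIntCoder())

        def process(self, element, buffer=beam.DoFn.StateParam(BUFFER)):
            key, value = element  # input must be a keyed PCollection
            buffer.add(value)
            yield key.name, sum(buffer.read())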

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140051493152512)>
# Thread: <Thread(Thread-119, started daemon 140051577751296)>
# Thread: <_MainThread(MainThread, started 140052357199616)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140051475842816)>
# Thread: <Thread(Thread-125, started daemon 140051484497664)>
# Thread: <_MainThread(MainThread, started 140052357199616)>
======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574845889.71_7a0355b2-6632-4d2d-a05b-82152cb681d2 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 310.837s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 4s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/fa44qpnkwi6pg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1636

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1636/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/11/27 06:19:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 06:19:47 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 06:19:48 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 06:19:48 INFO sdk_worker_main.main: Logging handler created.
19/11/27 06:19:48 INFO sdk_worker_main.start: Status HTTP server running at localhost:39025
19/11/27 06:19:48 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 06:19:48 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 06:19:48 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574835586.17_32374364-5bbc-444f-a125-f699d3afe24f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 06:19:48 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574835586.17', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49593', 'job_port': u'0'}
19/11/27 06:19:48 INFO statecache.__init__: Creating state cache with size 0
19/11/27 06:19:48 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45673.
19/11/27 06:19:48 INFO sdk_worker.__init__: Control channel established.
19/11/27 06:19:48 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 06:19:48 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/27 06:19:48 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38925.
19/11/27 06:19:48 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 06:19:48 INFO data_plane.create_data_channel: Creating client data channel for localhost:44269
19/11/27 06:19:48 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 06:19:49 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 06:19:49 INFO sdk_worker.run: No more requests from control plane
19/11/27 06:19:49 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 06:19:49 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 06:19:49 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 06:19:49 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 06:19:49 INFO sdk_worker.run: Done consuming work.
19/11/27 06:19:49 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 06:19:49 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 06:19:49 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 06:19:49 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 06:19:49 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 06:19:49 INFO sdk_worker_main.main: Logging handler created.
19/11/27 06:19:49 INFO sdk_worker_main.start: Status HTTP server running at localhost:37517
19/11/27 06:19:49 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 06:19:49 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 06:19:49 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574835586.17_32374364-5bbc-444f-a125-f699d3afe24f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 06:19:49 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574835586.17', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49593', 'job_port': u'0'}
19/11/27 06:19:49 INFO statecache.__init__: Creating state cache with size 0
19/11/27 06:19:49 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44249.
19/11/27 06:19:49 INFO sdk_worker.__init__: Control channel established.
19/11/27 06:19:49 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 06:19:49 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/27 06:19:49 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45855.
19/11/27 06:19:49 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 06:19:49 INFO data_plane.create_data_channel: Creating client data channel for localhost:36305
19/11/27 06:19:49 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 06:19:49 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 06:19:49 INFO sdk_worker.run: No more requests from control plane
19/11/27 06:19:49 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 06:19:49 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 06:19:49 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 06:19:49 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 06:19:49 INFO sdk_worker.run: Done consuming work.
19/11/27 06:19:49 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 06:19:49 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 06:19:49 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 06:19:49 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 06:19:50 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 06:19:50 INFO sdk_worker_main.main: Logging handler created.
19/11/27 06:19:50 INFO sdk_worker_main.start: Status HTTP server running at localhost:45343
19/11/27 06:19:50 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 06:19:50 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 06:19:50 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574835586.17_32374364-5bbc-444f-a125-f699d3afe24f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 06:19:50 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574835586.17', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49593', 'job_port': u'0'}
19/11/27 06:19:50 INFO statecache.__init__: Creating state cache with size 0
19/11/27 06:19:50 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46561.
19/11/27 06:19:50 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/27 06:19:50 INFO sdk_worker.__init__: Control channel established.
19/11/27 06:19:50 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 06:19:50 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36283.
19/11/27 06:19:50 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 06:19:50 INFO data_plane.create_data_channel: Creating client data channel for localhost:38449
19/11/27 06:19:50 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 06:19:50 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 06:19:50 INFO sdk_worker.run: No more requests from control plane
19/11/27 06:19:50 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 06:19:50 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 06:19:50 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 06:19:50 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 06:19:50 INFO sdk_worker.run: Done consuming work.
19/11/27 06:19:50 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 06:19:50 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 06:19:50 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 06:19:50 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 06:19:51 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 06:19:51 INFO sdk_worker_main.main: Logging handler created.
19/11/27 06:19:51 INFO sdk_worker_main.start: Status HTTP server running at localhost:43807
19/11/27 06:19:51 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 06:19:51 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 06:19:51 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574835586.17_32374364-5bbc-444f-a125-f699d3afe24f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 06:19:51 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574835586.17', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49593', 'job_port': u'0'}
19/11/27 06:19:51 INFO statecache.__init__: Creating state cache with size 0
19/11/27 06:19:51 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39155.
19/11/27 06:19:51 INFO sdk_worker.__init__: Control channel established.
19/11/27 06:19:51 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/27 06:19:51 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 06:19:51 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46167.
19/11/27 06:19:51 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 06:19:51 INFO data_plane.create_data_channel: Creating client data channel for localhost:36131
19/11/27 06:19:51 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 06:19:51 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 06:19:51 INFO sdk_worker.run: No more requests from control plane
19/11/27 06:19:51 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 06:19:51 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 06:19:51 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 06:19:51 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 06:19:51 INFO sdk_worker.run: Done consuming work.
19/11/27 06:19:51 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 06:19:51 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 06:19:51 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 06:19:51 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574835586.17_32374364-5bbc-444f-a125-f699d3afe24f finished.
19/11/27 06:19:51 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/27 06:19:51 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_89b7e81c-acae-4f1d-b18b-5a4b38f11fbf","basePath":"/tmp/sparktestBElStw"}: {}
java.io.FileNotFoundException: /tmp/sparktestBElStw/job_89b7e81c-acae-4f1d-b18b-5a4b38f11fbf/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
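
[Note: this stack trace is cleanup noise rather than a test failure; the job reports DONE immediately below. The job staged no artifacts (see the repeated "GetManifest for __no_artifacts_staged__" lines), so the MANIFEST file the removal path tries to open was never written. A hedged sketch of the kind of guard that would avoid the error, written in Python for consistency with the other notes; Beam's actual code here is Java and its eventual fix may differ:

    import errno
    import shutil

    def remove_staging_dir(base_path):
        # Tolerate a staging directory or MANIFEST that was never
        # created, e.g. for jobs that staged no artifacts at all.
        try:
            shutil.rmtree(base_path)
        except OSError as e:
            if e.errno != errno.ENOENT:
                raise  # real I/O problems still surface
]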
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139884032538368)>

# Thread: <Thread(Thread-119, started daemon 139884015752960)>

# Thread: <_MainThread(MainThread, started 139884814448384)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139883393840896)>

# Thread: <Thread(Thread-125, started daemon 139884007360256)>

# Thread: <_MainThread(MainThread, started 139884814448384)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574835576.58_d351b1c6-ffa7-4ca6-8036-512795e2e32e failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 298.384s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 33s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/u6krsw5yaw6oe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1635

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1635/display/redirect?page=changes>

Changes:

[aaltay] [BEAM-7390] Add code snippets for Count (#9923)

[aaltay] [BEAM-7390] Add code snippets for CombineGlobally (#9920)

[aaltay] [BEAM-7390] Add code snippets for CombineValues (#9922)

[aaltay] Fix sorting order bug. (#9883)


------------------------------------------
[...truncated 1.31 MB...]
19/11/27 02:02:40 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1574820159.28_d050456c-0004-4029-90c8-ecc41d0e8790 on Spark master local
19/11/27 02:02:40 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
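
[Note: this warning matters because the portable Spark runner groups elements by the encoded bytes of the key rather than by the key object itself. If a coder is not consistent with equals, meaning two equal values may encode to different byte strings, logically equal keys can land in different groups. A toy illustration of grouping by encoded bytes, with encode standing in for an arbitrary coder:

    from collections import defaultdict

    def group_by_encoded_key(pairs, encode):
        # Group values by the bytes of the encoded key, the way a
        # portable runner does; keys that compare equal but encode
        # differently would end up in separate groups here.
        groups = defaultdict(list)
        for key, value in pairs:
            groups[encode(key)].append(value)
        return groups
]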
19/11/27 02:02:40 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574820159.28_d050456c-0004-4029-90c8-ecc41d0e8790: Pipeline translated successfully. Computing outputs
19/11/27 02:02:40 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 02:02:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 02:02:40 INFO sdk_worker_main.main: Logging handler created.
19/11/27 02:02:40 INFO sdk_worker_main.start: Status HTTP server running at localhost:35153
19/11/27 02:02:40 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 02:02:40 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 02:02:40 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574820159.28_d050456c-0004-4029-90c8-ecc41d0e8790', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 02:02:40 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574820159.28', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:45491', 'job_port': u'0'}
19/11/27 02:02:40 INFO statecache.__init__: Creating state cache with size 0
19/11/27 02:02:40 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37599.
19/11/27 02:02:40 INFO sdk_worker.__init__: Control channel established.
19/11/27 02:02:40 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/11/27 02:02:40 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 02:02:40 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33409.
19/11/27 02:02:40 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 02:02:41 INFO data_plane.create_data_channel: Creating client data channel for localhost:41287
19/11/27 02:02:41 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 02:02:41 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 02:02:41 INFO sdk_worker.run: No more requests from control plane
19/11/27 02:02:41 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 02:02:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 02:02:41 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 02:02:41 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 02:02:41 INFO sdk_worker.run: Done consuming work.
19/11/27 02:02:41 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 02:02:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 02:02:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 02:02:41 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 02:02:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 02:02:41 INFO sdk_worker_main.main: Logging handler created.
19/11/27 02:02:41 INFO sdk_worker_main.start: Status HTTP server running at localhost:43983
19/11/27 02:02:41 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 02:02:41 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 02:02:41 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574820159.28_d050456c-0004-4029-90c8-ecc41d0e8790', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 02:02:41 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574820159.28', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:45491', 'job_port': u'0'}
19/11/27 02:02:41 INFO statecache.__init__: Creating state cache with size 0
19/11/27 02:02:41 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45603.
19/11/27 02:02:41 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/27 02:02:41 INFO sdk_worker.__init__: Control channel established.
19/11/27 02:02:41 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 02:02:41 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38699.
19/11/27 02:02:41 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 02:02:41 INFO data_plane.create_data_channel: Creating client data channel for localhost:37881
19/11/27 02:02:41 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 02:02:41 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 02:02:41 INFO sdk_worker.run: No more requests from control plane
19/11/27 02:02:41 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 02:02:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 02:02:41 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 02:02:41 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 02:02:41 INFO sdk_worker.run: Done consuming work.
19/11/27 02:02:41 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 02:02:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 02:02:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 02:02:42 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 02:02:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 02:02:42 INFO sdk_worker_main.main: Logging handler created.
19/11/27 02:02:42 INFO sdk_worker_main.start: Status HTTP server running at localhost:45947
19/11/27 02:02:42 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 02:02:42 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 02:02:42 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574820159.28_d050456c-0004-4029-90c8-ecc41d0e8790', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 02:02:42 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574820159.28', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:45491', 'job_port': u'0'}
19/11/27 02:02:42 INFO statecache.__init__: Creating state cache with size 0
19/11/27 02:02:42 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44405.
19/11/27 02:02:42 INFO sdk_worker.__init__: Control channel established.
19/11/27 02:02:42 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/27 02:02:42 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 02:02:42 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40483.
19/11/27 02:02:42 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 02:02:42 INFO data_plane.create_data_channel: Creating client data channel for localhost:43347
19/11/27 02:02:42 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 02:02:42 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 02:02:42 INFO sdk_worker.run: No more requests from control plane
19/11/27 02:02:42 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 02:02:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 02:02:42 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 02:02:42 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 02:02:42 INFO sdk_worker.run: Done consuming work.
19/11/27 02:02:42 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 02:02:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 02:02:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 02:02:42 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 02:02:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 02:02:43 INFO sdk_worker_main.main: Logging handler created.
19/11/27 02:02:43 INFO sdk_worker_main.start: Status HTTP server running at localhost:46271
19/11/27 02:02:43 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 02:02:43 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 02:02:43 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574820159.28_d050456c-0004-4029-90c8-ecc41d0e8790', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 02:02:43 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574820159.28', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:45491', 'job_port': u'0'}
19/11/27 02:02:43 INFO statecache.__init__: Creating state cache with size 0
19/11/27 02:02:43 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46519.
19/11/27 02:02:43 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/27 02:02:43 INFO sdk_worker.__init__: Control channel established.
19/11/27 02:02:43 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 02:02:43 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34085.
19/11/27 02:02:43 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 02:02:43 INFO data_plane.create_data_channel: Creating client data channel for localhost:40701
19/11/27 02:02:43 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 02:02:43 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 02:02:43 INFO sdk_worker.run: No more requests from control plane
19/11/27 02:02:43 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 02:02:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 02:02:43 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 02:02:43 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 02:02:43 INFO sdk_worker.run: Done consuming work.
19/11/27 02:02:43 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 02:02:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 02:02:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 02:02:43 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 02:02:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 02:02:44 INFO sdk_worker_main.main: Logging handler created.
19/11/27 02:02:44 INFO sdk_worker_main.start: Status HTTP server running at localhost:46439
19/11/27 02:02:44 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 02:02:44 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 02:02:44 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574820159.28_d050456c-0004-4029-90c8-ecc41d0e8790', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 02:02:44 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574820159.28', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:45491', 'job_port': u'0'}
19/11/27 02:02:44 INFO statecache.__init__: Creating state cache with size 0
19/11/27 02:02:44 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34815.
19/11/27 02:02:44 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/27 02:02:44 INFO sdk_worker.__init__: Control channel established.
19/11/27 02:02:44 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 02:02:44 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42857.
19/11/27 02:02:44 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 02:02:44 INFO data_plane.create_data_channel: Creating client data channel for localhost:40013
19/11/27 02:02:44 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 02:02:44 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 02:02:44 INFO sdk_worker.run: No more requests from control plane
19/11/27 02:02:44 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 02:02:44 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 02:02:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 02:02:44 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 02:02:44 INFO sdk_worker.run: Done consuming work.
19/11/27 02:02:44 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 02:02:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 02:02:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 02:02:44 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574820159.28_d050456c-0004-4029-90c8-ecc41d0e8790 finished.
19/11/27 02:02:44 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/27 02:02:44 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_84c6fcb7-ed28-47ca-88fe-907f2164c508","basePath":"/tmp/sparktestLvQ5zZ"}: {}
java.io.FileNotFoundException: /tmp/sparktestLvQ5zZ/job_84c6fcb7-ed28-47ca-88fe-907f2164c508/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574820150.13_9202a452-c727-4c63-93c5-3c34c4e4c94a failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140059464742656)>

# Thread: <Thread(Thread-120, started daemon 140059456349952)>

# Thread: <_MainThread(MainThread, started 140060259915520)>

----------------------------------------------------------------------
Ran 38 tests in 293.167s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 11m 39s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/m6ityq6fe32dk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1634

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1634/display/redirect>

Changes:


------------------------------------------
[...truncated 1.31 MB...]
19/11/27 00:19:57 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1574813996.88_3852a7f0-07fc-480e-a1bc-c581a09d9949 on Spark master local
19/11/27 00:19:57 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/11/27 00:19:57 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574813996.88_3852a7f0-07fc-480e-a1bc-c581a09d9949: Pipeline translated successfully. Computing outputs
19/11/27 00:19:57 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 00:19:58 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 00:19:58 INFO sdk_worker_main.main: Logging handler created.
19/11/27 00:19:58 INFO sdk_worker_main.start: Status HTTP server running at localhost:42799
19/11/27 00:19:58 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 00:19:58 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 00:19:58 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574813996.88_3852a7f0-07fc-480e-a1bc-c581a09d9949', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 00:19:58 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574813996.88', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44171', 'job_port': u'0'}
19/11/27 00:19:58 INFO statecache.__init__: Creating state cache with size 0
19/11/27 00:19:58 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39549.
19/11/27 00:19:58 INFO sdk_worker.__init__: Control channel established.
19/11/27 00:19:58 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 00:19:58 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/11/27 00:19:58 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42099.
19/11/27 00:19:58 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 00:19:58 INFO data_plane.create_data_channel: Creating client data channel for localhost:35621
19/11/27 00:19:58 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 00:19:58 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 00:19:58 INFO sdk_worker.run: No more requests from control plane
19/11/27 00:19:58 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 00:19:58 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 00:19:58 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 00:19:58 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 00:19:58 INFO sdk_worker.run: Done consuming work.
19/11/27 00:19:58 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 00:19:58 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 00:19:58 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 00:19:58 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 00:19:59 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 00:19:59 INFO sdk_worker_main.main: Logging handler created.
19/11/27 00:19:59 INFO sdk_worker_main.start: Status HTTP server running at localhost:41413
19/11/27 00:19:59 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 00:19:59 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 00:19:59 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574813996.88_3852a7f0-07fc-480e-a1bc-c581a09d9949', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 00:19:59 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574813996.88', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44171', 'job_port': u'0'}
19/11/27 00:19:59 INFO statecache.__init__: Creating state cache with size 0
19/11/27 00:19:59 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45611.
19/11/27 00:19:59 INFO sdk_worker.__init__: Control channel established.
19/11/27 00:19:59 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 00:19:59 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/27 00:19:59 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38737.
19/11/27 00:19:59 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 00:19:59 INFO data_plane.create_data_channel: Creating client data channel for localhost:37199
19/11/27 00:19:59 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 00:19:59 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 00:19:59 INFO sdk_worker.run: No more requests from control plane
19/11/27 00:19:59 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 00:19:59 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 00:19:59 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 00:19:59 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 00:19:59 INFO sdk_worker.run: Done consuming work.
19/11/27 00:19:59 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 00:19:59 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 00:19:59 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 00:19:59 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 00:20:00 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 00:20:00 INFO sdk_worker_main.main: Logging handler created.
19/11/27 00:20:00 INFO sdk_worker_main.start: Status HTTP server running at localhost:41525
19/11/27 00:20:00 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 00:20:00 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 00:20:00 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574813996.88_3852a7f0-07fc-480e-a1bc-c581a09d9949', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 00:20:00 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574813996.88', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44171', 'job_port': u'0'}
19/11/27 00:20:00 INFO statecache.__init__: Creating state cache with size 0
19/11/27 00:20:00 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43349.
19/11/27 00:20:00 INFO sdk_worker.__init__: Control channel established.
19/11/27 00:20:00 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 00:20:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/27 00:20:00 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44705.
19/11/27 00:20:00 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 00:20:00 INFO data_plane.create_data_channel: Creating client data channel for localhost:40319
19/11/27 00:20:00 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 00:20:00 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 00:20:00 INFO sdk_worker.run: No more requests from control plane
19/11/27 00:20:00 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 00:20:00 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 00:20:00 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 00:20:00 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 00:20:00 INFO sdk_worker.run: Done consuming work.
19/11/27 00:20:00 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 00:20:00 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 00:20:00 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 00:20:00 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 00:20:01 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 00:20:01 INFO sdk_worker_main.main: Logging handler created.
19/11/27 00:20:01 INFO sdk_worker_main.start: Status HTTP server running at localhost:38693
19/11/27 00:20:01 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 00:20:01 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 00:20:01 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574813996.88_3852a7f0-07fc-480e-a1bc-c581a09d9949', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 00:20:01 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574813996.88', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44171', 'job_port': u'0'}
19/11/27 00:20:01 INFO statecache.__init__: Creating state cache with size 0
19/11/27 00:20:01 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40735.
19/11/27 00:20:01 INFO sdk_worker.__init__: Control channel established.
19/11/27 00:20:01 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 00:20:01 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/27 00:20:01 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45307.
19/11/27 00:20:01 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 00:20:01 INFO data_plane.create_data_channel: Creating client data channel for localhost:36653
19/11/27 00:20:01 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 00:20:01 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 00:20:01 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 00:20:01 INFO sdk_worker.run: No more requests from control plane
19/11/27 00:20:01 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 00:20:01 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 00:20:01 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 00:20:01 INFO sdk_worker.run: Done consuming work.
19/11/27 00:20:01 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 00:20:01 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 00:20:01 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 00:20:01 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 00:20:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 00:20:02 INFO sdk_worker_main.main: Logging handler created.
19/11/27 00:20:02 INFO sdk_worker_main.start: Status HTTP server running at localhost:35515
19/11/27 00:20:02 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 00:20:02 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 00:20:02 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574813996.88_3852a7f0-07fc-480e-a1bc-c581a09d9949', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 00:20:02 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574813996.88', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44171', 'job_port': u'0'}
19/11/27 00:20:02 INFO statecache.__init__: Creating state cache with size 0
19/11/27 00:20:02 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44231.
19/11/27 00:20:02 INFO sdk_worker.__init__: Control channel established.
19/11/27 00:20:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/27 00:20:02 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 00:20:02 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40867.
19/11/27 00:20:02 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 00:20:02 INFO data_plane.create_data_channel: Creating client data channel for localhost:42137
19/11/27 00:20:02 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 00:20:02 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 00:20:02 INFO sdk_worker.run: No more requests from control plane
19/11/27 00:20:02 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 00:20:02 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 00:20:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 00:20:02 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 00:20:02 INFO sdk_worker.run: Done consuming work.
19/11/27 00:20:02 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 00:20:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 00:20:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 00:20:02 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574813996.88_3852a7f0-07fc-480e-a1bc-c581a09d9949 finished.
19/11/27 00:20:02 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
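The WARN above explains why no metrics are reported for the finished job: the Spark portable runner does not yet collect monitoring infos. For comparison, on runners that do support them, a query sketched like the following (hypothetical counter name, result standing in for the PipelineResult returned by pipeline.run()) would return the job's counters; here it would come back empty:

    from apache_beam.metrics.metric import MetricsFilter

    # 'my_counter' is a hypothetical metric name; result is the PipelineResult.
    query_result = result.metrics().query(MetricsFilter().with_name('my_counter'))
    counters = query_result['counters']  # empty on this runner, per the WARN above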
19/11/27 00:20:02 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_339cbc4b-1b82-49ed-9760-dc29ab2c1633","basePath":"/tmp/sparktestvgC75n"}: {}
java.io.FileNotFoundException: /tmp/sparktestvgC75n/job_339cbc4b-1b82-49ed-9760-dc29ab2c1633/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
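The stack trace shows a benign cleanup race rather than a test failure: the job finished, no artifacts were ever staged (see the earlier "GetManifest for __no_artifacts_staged__" lines), yet removeArtifacts still tries to open the staging MANIFEST and logs a FileNotFoundException. A minimal sketch, in Python rather than the Java above, of the tolerant removal this suggests; names are illustrative, not Beam's actual cleanup code:

    import errno
    import os
    import shutil

    def remove_staging_dir(base_path, session_id):
        # e.g. /tmp/sparktestvgC75n/job_339cbc4b-...
        job_dir = os.path.join(base_path, session_id)
        try:
            shutil.rmtree(job_dir)
        except OSError as e:
            # A missing staging directory just means there is nothing to
            # clean up; anything else is a real error and should propagate.
            if e.errno != errno.ENOENT:
                raise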
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
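The last two frames show where the timeout comes from: the test installs its own handler that raises BaseException (not Exception) so the timeout cannot be swallowed by broad except clauses while the runner blocks on the gRPC state stream. A minimal sketch of that pattern, assuming a SIGALRM-based handler; run_pipeline_under_test is a hypothetical stand-in for the test body:

    import signal

    TIMEOUT_SECS = 60

    def handler(signum, frame):
        # BaseException deliberately bypasses "except Exception" blocks.
        raise BaseException('Timed out after %d seconds.' % TIMEOUT_SECS)

    signal.signal(signal.SIGALRM, handler)
    signal.alarm(TIMEOUT_SECS)     # arm the timer before running the pipeline
    try:
        run_pipeline_under_test()  # hypothetical stand-in for the test body
    finally:
        signal.alarm(0)            # always disarm the pending alarm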

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140047058089728)>
# Thread: <Thread(Thread-118, started daemon 140047049697024)>
# Thread: <_MainThread(MainThread, started 140047839999744)>

Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574813987.29_6bef242d-f018-4d69-8726-3a5fcb49baa6 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
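The UnsupportedOperationException pins this failure on the runner rather than the test: the message suggests the SDK tried to checkpoint the bundle during watermark tracking, and the Spark portable runner has no checkpoint handler registered for its ActiveBundle. For context, a minimal sketch of the kind of splittable DoFn such a test exercises (illustrative names, not the actual test code), expanding each string element under an offset restriction:

    import apache_beam as beam
    from apache_beam.io.restriction_trackers import OffsetRange, OffsetRestrictionTracker
    from apache_beam.transforms.core import RestrictionProvider

    class ExpandStringsProvider(RestrictionProvider):
        def initial_restriction(self, element):
            return OffsetRange(0, len(element))

        def create_tracker(self, restriction):
            return OffsetRestrictionTracker(restriction)

        def restriction_size(self, element, restriction):
            return restriction.size()

    class ExpandStringsDoFn(beam.DoFn):
        def process(
            self,
            element,
            restriction_tracker=beam.DoFn.RestrictionParam(ExpandStringsProvider())):
            # Claim one offset at a time; each claim is a point where the
            # runner may checkpoint the bundle, which is what the Spark
            # portable runner cannot yet do.
            cur = restriction_tracker.current_restriction().start
            while restriction_tracker.try_claim(cur):
                yield element[cur]
                cur += 1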

----------------------------------------------------------------------
Ran 38 tests in 306.483s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 49s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/zv5flclpq2636

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1633

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1633/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/11/26 18:24:38 INFO sdk_worker_main.start: Status HTTP server running at localhost:32817
19/11/26 18:24:38 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 18:24:38 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 18:24:38 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574792675.76_3de1076c-6b3d-4a7e-a3c1-d142805cb3b6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 18:24:38 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574792675.76', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:59823', 'job_port': u'0'}
19/11/26 18:24:38 INFO statecache.__init__: Creating state cache with size 0
19/11/26 18:24:38 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40621.
19/11/26 18:24:38 INFO sdk_worker.__init__: Control channel established.
19/11/26 18:24:38 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/26 18:24:38 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 18:24:38 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40649.
19/11/26 18:24:38 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 18:24:38 INFO data_plane.create_data_channel: Creating client data channel for localhost:34243
19/11/26 18:24:38 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 18:24:38 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/26 18:24:38 INFO sdk_worker.run: No more requests from control plane
19/11/26 18:24:38 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 18:24:38 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 18:24:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 18:24:38 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 18:24:38 INFO sdk_worker.run: Done consuming work.
19/11/26 18:24:38 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 18:24:38 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 18:24:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 18:24:38 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/26 18:24:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 18:24:39 INFO sdk_worker_main.main: Logging handler created.
19/11/26 18:24:39 INFO sdk_worker_main.start: Status HTTP server running at localhost:37383
19/11/26 18:24:39 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 18:24:39 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 18:24:39 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574792675.76_3de1076c-6b3d-4a7e-a3c1-d142805cb3b6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 18:24:39 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574792675.76', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:59823', 'job_port': u'0'}
19/11/26 18:24:39 INFO statecache.__init__: Creating state cache with size 0
19/11/26 18:24:39 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35947.
19/11/26 18:24:39 INFO sdk_worker.__init__: Control channel established.
19/11/26 18:24:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/26 18:24:39 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 18:24:39 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35705.
19/11/26 18:24:39 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 18:24:39 INFO data_plane.create_data_channel: Creating client data channel for localhost:43591
19/11/26 18:24:39 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 18:24:39 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/26 18:24:39 INFO sdk_worker.run: No more requests from control plane
19/11/26 18:24:39 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 18:24:39 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 18:24:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 18:24:39 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 18:24:39 INFO sdk_worker.run: Done consuming work.
19/11/26 18:24:39 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 18:24:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 18:24:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 18:24:39 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/26 18:24:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 18:24:39 INFO sdk_worker_main.main: Logging handler created.
19/11/26 18:24:39 INFO sdk_worker_main.start: Status HTTP server running at localhost:38005
19/11/26 18:24:39 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 18:24:39 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 18:24:39 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574792675.76_3de1076c-6b3d-4a7e-a3c1-d142805cb3b6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 18:24:39 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574792675.76', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:59823', 'job_port': u'0'}
19/11/26 18:24:39 INFO statecache.__init__: Creating state cache with size 0
19/11/26 18:24:39 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45711.
19/11/26 18:24:39 INFO sdk_worker.__init__: Control channel established.
19/11/26 18:24:39 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 18:24:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/26 18:24:39 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33819.
19/11/26 18:24:39 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 18:24:39 INFO data_plane.create_data_channel: Creating client data channel for localhost:38013
19/11/26 18:24:39 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 18:24:39 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/26 18:24:39 INFO sdk_worker.run: No more requests from control plane
19/11/26 18:24:39 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 18:24:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 18:24:39 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 18:24:39 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 18:24:39 INFO sdk_worker.run: Done consuming work.
19/11/26 18:24:39 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 18:24:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 18:24:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 18:24:40 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/26 18:24:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 18:24:40 INFO sdk_worker_main.main: Logging handler created.
19/11/26 18:24:40 INFO sdk_worker_main.start: Status HTTP server running at localhost:34671
19/11/26 18:24:40 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 18:24:40 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 18:24:40 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574792675.76_3de1076c-6b3d-4a7e-a3c1-d142805cb3b6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 18:24:40 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574792675.76', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:59823', 'job_port': u'0'}
19/11/26 18:24:40 INFO statecache.__init__: Creating state cache with size 0
19/11/26 18:24:40 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35257.
19/11/26 18:24:40 INFO sdk_worker.__init__: Control channel established.
19/11/26 18:24:40 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 18:24:40 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/26 18:24:40 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37541.
19/11/26 18:24:40 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 18:24:40 INFO data_plane.create_data_channel: Creating client data channel for localhost:34145
19/11/26 18:24:40 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 18:24:40 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/26 18:24:40 INFO sdk_worker.run: No more requests from control plane
19/11/26 18:24:40 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 18:24:40 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 18:24:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 18:24:40 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 18:24:40 INFO sdk_worker.run: Done consuming work.
19/11/26 18:24:40 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 18:24:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 18:24:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 18:24:40 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574792675.76_3de1076c-6b3d-4a7e-a3c1-d142805cb3b6 finished.
19/11/26 18:24:40 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/26 18:24:40 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_13dc8541-e8e9-4cf3-91a0-be671526cfc4","basePath":"/tmp/sparktestd_SpbU"}: {}
java.io.FileNotFoundException: /tmp/sparktestd_SpbU/job_13dc8541-e8e9-4cf3-91a0-be671526cfc4/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140602456925952)>
# Thread: <Thread(Thread-117, started daemon 140602465318656)>
# Thread: <_MainThread(MainThread, started 140603596281600)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140602440140544)>
# Thread: <Thread(Thread-123, started daemon 140602448533248)>
# Thread: <Thread(Thread-117, started daemon 140602465318656)>

Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
# Thread: <Thread(wait_until_finish_read, started daemon 140602456925952)>
# Thread: <_MainThread(MainThread, started 140603596281600)>

Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574792666.71_6e024191-3c3b-4113-80c9-7707333d735d failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 315.409s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 50s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/r72nscqulew4a

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1632

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1632/display/redirect?page=changes>

Changes:

[echauchot] [BEAM-8470] move enableSparkMetricSinks option to common spark pipeline


------------------------------------------
[...truncated 1.32 MB...]
19/11/26 13:12:43 INFO sdk_worker_main.start: Status HTTP server running at localhost:37569
19/11/26 13:12:43 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 13:12:43 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 13:12:43 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574773961.06_aa845bd5-0215-4311-b66a-d96d8cd0162b', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 13:12:43 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574773961.06', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56319', 'job_port': u'0'}
19/11/26 13:12:43 INFO statecache.__init__: Creating state cache with size 0
19/11/26 13:12:43 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37633.
19/11/26 13:12:43 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/26 13:12:43 INFO sdk_worker.__init__: Control channel established.
19/11/26 13:12:43 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 13:12:43 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36129.
19/11/26 13:12:43 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 13:12:43 INFO data_plane.create_data_channel: Creating client data channel for localhost:44235
19/11/26 13:12:43 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 13:12:44 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/26 13:12:44 INFO sdk_worker.run: No more requests from control plane
19/11/26 13:12:44 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 13:12:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 13:12:44 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 13:12:44 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 13:12:44 INFO sdk_worker.run: Done consuming work.
19/11/26 13:12:44 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 13:12:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 13:12:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 13:12:44 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/26 13:12:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 13:12:44 INFO sdk_worker_main.main: Logging handler created.
19/11/26 13:12:44 INFO sdk_worker_main.start: Status HTTP server running at localhost:39027
19/11/26 13:12:44 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 13:12:44 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 13:12:44 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574773961.06_aa845bd5-0215-4311-b66a-d96d8cd0162b', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 13:12:44 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574773961.06', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56319', 'job_port': u'0'}
19/11/26 13:12:44 INFO statecache.__init__: Creating state cache with size 0
19/11/26 13:12:44 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43541.
19/11/26 13:12:44 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/26 13:12:44 INFO sdk_worker.__init__: Control channel established.
19/11/26 13:12:44 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 13:12:45 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33311.
19/11/26 13:12:45 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 13:12:45 INFO data_plane.create_data_channel: Creating client data channel for localhost:45359
19/11/26 13:12:45 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 13:12:45 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/26 13:12:45 INFO sdk_worker.run: No more requests from control plane
19/11/26 13:12:45 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 13:12:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 13:12:45 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 13:12:45 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 13:12:45 INFO sdk_worker.run: Done consuming work.
19/11/26 13:12:45 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 13:12:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 13:12:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 13:12:45 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/26 13:12:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 13:12:45 INFO sdk_worker_main.main: Logging handler created.
19/11/26 13:12:45 INFO sdk_worker_main.start: Status HTTP server running at localhost:45469
19/11/26 13:12:45 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 13:12:45 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 13:12:46 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574773961.06_aa845bd5-0215-4311-b66a-d96d8cd0162b', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 13:12:46 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574773961.06', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56319', 'job_port': u'0'}
19/11/26 13:12:46 INFO statecache.__init__: Creating state cache with size 0
19/11/26 13:12:46 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43479.
19/11/26 13:12:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/26 13:12:46 INFO sdk_worker.__init__: Control channel established.
19/11/26 13:12:46 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 13:12:46 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44813.
19/11/26 13:12:46 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 13:12:46 INFO data_plane.create_data_channel: Creating client data channel for localhost:40145
19/11/26 13:12:46 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 13:12:46 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/26 13:12:46 INFO sdk_worker.run: No more requests from control plane
19/11/26 13:12:46 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 13:12:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 13:12:46 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 13:12:46 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 13:12:46 INFO sdk_worker.run: Done consuming work.
19/11/26 13:12:46 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 13:12:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 13:12:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 13:12:46 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/26 13:12:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 13:12:46 INFO sdk_worker_main.main: Logging handler created.
19/11/26 13:12:46 INFO sdk_worker_main.start: Status HTTP server running at localhost:39459
19/11/26 13:12:46 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 13:12:46 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 13:12:46 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574773961.06_aa845bd5-0215-4311-b66a-d96d8cd0162b', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 13:12:46 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574773961.06', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56319', 'job_port': u'0'}
19/11/26 13:12:46 INFO statecache.__init__: Creating state cache with size 0
19/11/26 13:12:46 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40591.
19/11/26 13:12:46 INFO sdk_worker.__init__: Control channel established.
19/11/26 13:12:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/26 13:12:46 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 13:12:46 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44659.
19/11/26 13:12:46 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 13:12:46 INFO data_plane.create_data_channel: Creating client data channel for localhost:44603
19/11/26 13:12:47 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 13:12:47 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/26 13:12:47 INFO sdk_worker.run: No more requests from control plane
19/11/26 13:12:47 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 13:12:47 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 13:12:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 13:12:47 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 13:12:47 INFO sdk_worker.run: Done consuming work.
19/11/26 13:12:47 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 13:12:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 13:12:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 13:12:47 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574773961.06_aa845bd5-0215-4311-b66a-d96d8cd0162b finished.
19/11/26 13:12:47 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/26 13:12:47 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_f229307b-9dd8-49ef-b472-8c3266a2adcb","basePath":"/tmp/sparktesthGpFaD"}: {}
java.io.FileNotFoundException: /tmp/sparktesthGpFaD/job_f229307b-9dd8-49ef-b472-8c3266a2adcb/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
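
The "handler" frame at the bottom of this traceback is the test suite's own watchdog: it arms a 60-second alarm and, when the alarm fires, dumps the live threads to stderr (the "# Thread:" lines appearing below) and raises to abort the hung wait_until_finish() call. A minimal sketch of such a watchdog follows; the names and the SIGALRM mechanism are assumptions for illustration, not Beam's actual implementation:

    # Hypothetical timeout watchdog sketch (Unix-only; not Beam's code).
    from __future__ import print_function

    import signal
    import sys
    import threading

    TIMEOUT_SECS = 60  # matches the 60-second timeout seen in this log

    def handler(signum, frame):
        msg = 'Timed out after %d seconds.' % TIMEOUT_SECS
        # Dump every live thread so a hung test leaves a trail in the log.
        print('==================== %s ====================' % msg, file=sys.stderr)
        for t in threading.enumerate():
            print('# Thread: %s' % t, file=sys.stderr)
        # BaseException (not Exception) so broad except clauses cannot swallow it.
        raise BaseException(msg)

    signal.signal(signal.SIGALRM, handler)
    signal.alarm(TIMEOUT_SECS)  # re-arm per test; cancel with signal.alarm(0) on success

Because the handler prints to stderr from inside whatever frame happened to be running, its output lands interleaved with the unittest traceback, which is why the raw console text below needed untangling.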

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139790431278848)>

# Thread: <Thread(Thread-119, started daemon 139790439671552)>

# Thread: <_MainThread(MainThread, started 139791566620416)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139790414493440)>

# Thread: <Thread(Thread-125, started daemon 139790422886144)>

# Thread: <Thread(Thread-119, started daemon 139790439671552)>

# Thread: <_MainThread(MainThread, started 139791566620416)>

# Thread: <Thread(wait_until_finish_read, started daemon 139790431278848)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574773950.71_ec3309df-46eb-4cbc-b027-b2109e1d1f22 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 323.475s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 10s
60 actionable tasks: 48 executed, 12 from cache

Publishing build scan...
https://gradle.com/s/6p2phocuswu3y

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1631

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1631/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/11/26 12:14:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 12:14:40 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/26 12:14:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 12:14:40 INFO sdk_worker_main.main: Logging handler created.
19/11/26 12:14:40 INFO sdk_worker_main.start: Status HTTP server running at localhost:46119
19/11/26 12:14:40 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 12:14:40 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 12:14:40 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574770478.3_62e3d065-73c0-40ef-ad5b-7a7d8bd213d2', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 12:14:40 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574770478.3', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:32881', 'job_port': u'0'}
19/11/26 12:14:40 INFO statecache.__init__: Creating state cache with size 0
19/11/26 12:14:40 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33029.
19/11/26 12:14:40 INFO sdk_worker.__init__: Control channel established.
19/11/26 12:14:40 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/26 12:14:40 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 12:14:40 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36359.
19/11/26 12:14:40 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 12:14:40 INFO data_plane.create_data_channel: Creating client data channel for localhost:44923
19/11/26 12:14:40 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 12:14:40 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/26 12:14:40 INFO sdk_worker.run: No more requests from control plane
19/11/26 12:14:40 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 12:14:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 12:14:40 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 12:14:40 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 12:14:40 INFO sdk_worker.run: Done consuming work.
19/11/26 12:14:40 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 12:14:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 12:14:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 12:14:41 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/26 12:14:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 12:14:41 INFO sdk_worker_main.main: Logging handler created.
19/11/26 12:14:41 INFO sdk_worker_main.start: Status HTTP server running at localhost:36265
19/11/26 12:14:41 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 12:14:41 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 12:14:41 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574770478.3_62e3d065-73c0-40ef-ad5b-7a7d8bd213d2', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 12:14:41 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574770478.3', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:32881', 'job_port': u'0'}
19/11/26 12:14:41 INFO statecache.__init__: Creating state cache with size 0
19/11/26 12:14:41 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46849.
19/11/26 12:14:41 INFO sdk_worker.__init__: Control channel established.
19/11/26 12:14:41 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/26 12:14:41 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 12:14:41 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44575.
19/11/26 12:14:41 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 12:14:41 INFO data_plane.create_data_channel: Creating client data channel for localhost:43327
19/11/26 12:14:41 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 12:14:41 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/26 12:14:41 INFO sdk_worker.run: No more requests from control plane
19/11/26 12:14:41 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 12:14:41 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 12:14:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 12:14:41 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 12:14:41 INFO sdk_worker.run: Done consuming work.
19/11/26 12:14:41 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 12:14:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 12:14:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 12:14:41 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/26 12:14:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 12:14:42 INFO sdk_worker_main.main: Logging handler created.
19/11/26 12:14:42 INFO sdk_worker_main.start: Status HTTP server running at localhost:42449
19/11/26 12:14:42 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 12:14:42 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 12:14:42 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574770478.3_62e3d065-73c0-40ef-ad5b-7a7d8bd213d2', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 12:14:42 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574770478.3', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:32881', 'job_port': u'0'}
19/11/26 12:14:42 INFO statecache.__init__: Creating state cache with size 0
19/11/26 12:14:42 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41229.
19/11/26 12:14:42 INFO sdk_worker.__init__: Control channel established.
19/11/26 12:14:42 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/26 12:14:42 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 12:14:42 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42527.
19/11/26 12:14:42 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 12:14:42 INFO data_plane.create_data_channel: Creating client data channel for localhost:40719
19/11/26 12:14:42 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 12:14:42 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/26 12:14:42 INFO sdk_worker.run: No more requests from control plane
19/11/26 12:14:42 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 12:14:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 12:14:42 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 12:14:42 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 12:14:42 INFO sdk_worker.run: Done consuming work.
19/11/26 12:14:42 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 12:14:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 12:14:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 12:14:42 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/26 12:14:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 12:14:43 INFO sdk_worker_main.main: Logging handler created.
19/11/26 12:14:43 INFO sdk_worker_main.start: Status HTTP server running at localhost:35763
19/11/26 12:14:43 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 12:14:43 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 12:14:43 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574770478.3_62e3d065-73c0-40ef-ad5b-7a7d8bd213d2', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 12:14:43 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574770478.3', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:32881', 'job_port': u'0'}
19/11/26 12:14:43 INFO statecache.__init__: Creating state cache with size 0
19/11/26 12:14:43 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45531.
19/11/26 12:14:43 INFO sdk_worker.__init__: Control channel established.
19/11/26 12:14:43 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 12:14:43 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/26 12:14:43 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37081.
19/11/26 12:14:43 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 12:14:43 INFO data_plane.create_data_channel: Creating client data channel for localhost:37545
19/11/26 12:14:43 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 12:14:43 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/26 12:14:43 INFO sdk_worker.run: No more requests from control plane
19/11/26 12:14:43 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 12:14:43 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 12:14:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 12:14:43 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 12:14:43 INFO sdk_worker.run: Done consuming work.
19/11/26 12:14:43 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 12:14:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 12:14:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 12:14:43 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574770478.3_62e3d065-73c0-40ef-ad5b-7a7d8bd213d2 finished.
19/11/26 12:14:43 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/26 12:14:43 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_9cc2a8af-11cc-4b4a-97a4-84b56434bf50","basePath":"/tmp/sparktestsgI2fG"}: {}
java.io.FileNotFoundException: /tmp/sparktestsgI2fG/job_9cc2a8af-11cc-4b4a-97a4-84b56434bf50/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139653596305152)>

# Thread: <Thread(Thread-117, started daemon 139653604697856)>

# Thread: <_MainThread(MainThread, started 139654588397312)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139653571127040)>

# Thread: <Thread(Thread-122, started daemon 139653579519744)>

# Thread: <_MainThread(MainThread, started 139654588397312)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574770469.12_b6cbd08f-d55d-49da-a933-4e65dd2a684f failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 288.881s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 56s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/f2jkmhu2fy736

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1630

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1630/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/11/26 06:13:24 INFO sdk_worker_main.start: Status HTTP server running at localhost:42783
19/11/26 06:13:24 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 06:13:24 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 06:13:24 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574748801.88_a44ffbe1-f921-4bcf-a8c3-7eaddea9af78', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 06:13:24 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574748801.88', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54415', 'job_port': u'0'}
19/11/26 06:13:24 INFO statecache.__init__: Creating state cache with size 0
19/11/26 06:13:24 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46007.
19/11/26 06:13:24 INFO sdk_worker.__init__: Control channel established.
19/11/26 06:13:24 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/26 06:13:24 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 06:13:24 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46323.
19/11/26 06:13:24 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 06:13:24 INFO data_plane.create_data_channel: Creating client data channel for localhost:44327
19/11/26 06:13:24 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 06:13:24 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/26 06:13:24 INFO sdk_worker.run: No more requests from control plane
19/11/26 06:13:24 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 06:13:24 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 06:13:24 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 06:13:24 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 06:13:24 INFO sdk_worker.run: Done consuming work.
19/11/26 06:13:24 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 06:13:24 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 06:13:24 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 06:13:24 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/26 06:13:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 06:13:25 INFO sdk_worker_main.main: Logging handler created.
19/11/26 06:13:25 INFO sdk_worker_main.start: Status HTTP server running at localhost:41749
19/11/26 06:13:25 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 06:13:25 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 06:13:25 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574748801.88_a44ffbe1-f921-4bcf-a8c3-7eaddea9af78', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 06:13:25 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574748801.88', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54415', 'job_port': u'0'}
19/11/26 06:13:25 INFO statecache.__init__: Creating state cache with size 0
19/11/26 06:13:25 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37541.
19/11/26 06:13:25 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/26 06:13:25 INFO sdk_worker.__init__: Control channel established.
19/11/26 06:13:25 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 06:13:25 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37283.
19/11/26 06:13:25 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 06:13:25 INFO data_plane.create_data_channel: Creating client data channel for localhost:32819
19/11/26 06:13:25 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 06:13:25 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/26 06:13:25 INFO sdk_worker.run: No more requests from control plane
19/11/26 06:13:25 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 06:13:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 06:13:25 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 06:13:25 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 06:13:25 INFO sdk_worker.run: Done consuming work.
19/11/26 06:13:25 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 06:13:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 06:13:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 06:13:25 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/26 06:13:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 06:13:26 INFO sdk_worker_main.main: Logging handler created.
19/11/26 06:13:26 INFO sdk_worker_main.start: Status HTTP server running at localhost:42179
19/11/26 06:13:26 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 06:13:26 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 06:13:26 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574748801.88_a44ffbe1-f921-4bcf-a8c3-7eaddea9af78', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 06:13:26 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574748801.88', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54415', 'job_port': u'0'}
19/11/26 06:13:26 INFO statecache.__init__: Creating state cache with size 0
19/11/26 06:13:26 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46041.
19/11/26 06:13:26 INFO sdk_worker.__init__: Control channel established.
19/11/26 06:13:26 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/26 06:13:26 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 06:13:26 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36743.
19/11/26 06:13:26 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 06:13:26 INFO data_plane.create_data_channel: Creating client data channel for localhost:41435
19/11/26 06:13:26 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 06:13:26 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/26 06:13:26 INFO sdk_worker.run: No more requests from control plane
19/11/26 06:13:26 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 06:13:26 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 06:13:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 06:13:26 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 06:13:26 INFO sdk_worker.run: Done consuming work.
19/11/26 06:13:26 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 06:13:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 06:13:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 06:13:26 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/26 06:13:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 06:13:27 INFO sdk_worker_main.main: Logging handler created.
19/11/26 06:13:27 INFO sdk_worker_main.start: Status HTTP server running at localhost:34225
19/11/26 06:13:27 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 06:13:27 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 06:13:27 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574748801.88_a44ffbe1-f921-4bcf-a8c3-7eaddea9af78', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 06:13:27 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574748801.88', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54415', 'job_port': u'0'}
19/11/26 06:13:27 INFO statecache.__init__: Creating state cache with size 0
19/11/26 06:13:27 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39659.
19/11/26 06:13:27 INFO sdk_worker.__init__: Control channel established.
19/11/26 06:13:27 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 06:13:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/26 06:13:27 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:32791.
19/11/26 06:13:27 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 06:13:27 INFO data_plane.create_data_channel: Creating client data channel for localhost:34521
19/11/26 06:13:27 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 06:13:27 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/26 06:13:27 INFO sdk_worker.run: No more requests from control plane
19/11/26 06:13:27 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 06:13:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 06:13:27 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 06:13:27 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 06:13:27 INFO sdk_worker.run: Done consuming work.
19/11/26 06:13:27 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 06:13:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 06:13:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 06:13:27 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574748801.88_a44ffbe1-f921-4bcf-a8c3-7eaddea9af78 finished.
19/11/26 06:13:27 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/26 06:13:27 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_724f73c0-f079-4e0d-8a7b-183f8de058ac","basePath":"/tmp/sparktestS7dEH3"}: {}
java.io.FileNotFoundException: /tmp/sparktestS7dEH3/job_724f73c0-f079-4e0d-8a7b-183f8de058ac/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140466384054016)>

# Thread: <Thread(Thread-119, started daemon 140466024605440)>

# Thread: <_MainThread(MainThread, started 140467163793152)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140466016212736)>

# Thread: <Thread(Thread-125, started daemon 140466007820032)>

# Thread: <Thread(Thread-119, started daemon 140466024605440)>

# Thread: <_MainThread(MainThread, started 140467163793152)>

# Thread: <Thread(wait_until_finish_read, started daemon 140466384054016)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574748791.61_c5a7d937-c1d9-4432-9922-3d6bc0b05cee failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 322.805s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 12s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/4azdl35p4qdpa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1629

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1629/display/redirect?page=changes>

Changes:

[suztomo] Dataflow Java worker to avoid undeclared Guava

[suztomo] Beam SQL JDBC driver not to declare unused Guava

[suztomo] KinesisIO to declare Guava dependency

[suztomo] ZetaSQL to declare Guava dependency

[suztomo] Removed unused dependency from elasticsearch-tests-2


------------------------------------------
[...truncated 1.32 MB...]
19/11/26 05:19:15 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 05:19:15 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/26 05:19:16 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 05:19:16 INFO sdk_worker_main.main: Logging handler created.
19/11/26 05:19:16 INFO sdk_worker_main.start: Status HTTP server running at localhost:38553
19/11/26 05:19:16 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 05:19:16 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 05:19:16 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574745553.61_de983ce8-fcfe-4364-a487-858d27dff9a3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 05:19:16 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574745553.61', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48359', 'job_port': u'0'}
19/11/26 05:19:16 INFO statecache.__init__: Creating state cache with size 0
19/11/26 05:19:16 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42363.
19/11/26 05:19:16 INFO sdk_worker.__init__: Control channel established.
19/11/26 05:19:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/26 05:19:16 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 05:19:16 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35309.
19/11/26 05:19:16 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 05:19:16 INFO data_plane.create_data_channel: Creating client data channel for localhost:40083
19/11/26 05:19:16 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 05:19:16 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/26 05:19:16 INFO sdk_worker.run: No more requests from control plane
19/11/26 05:19:16 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 05:19:16 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 05:19:16 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 05:19:16 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 05:19:16 INFO sdk_worker.run: Done consuming work.
19/11/26 05:19:16 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 05:19:16 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 05:19:16 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 05:19:16 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
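The worker start/teardown cycle above repeats for every bundle because this job runs with 'environment_cache_millis': u'0' (visible in the pipeline_options lines), so the process environment is closed as soon as a bundle completes instead of being reused. A minimal sketch of that eviction behavior, assuming a simple TTL cache; the names are illustrative, not Beam's:

    import time

    class TtlCache(object):
        """Caches values for ttl_millis; a TTL of 0 evicts immediately."""

        def __init__(self, ttl_millis):
            self.ttl_millis = ttl_millis
            self._entries = {}  # key -> (value, expiry in epoch millis)

        def get(self, key, factory):
            value, expiry = self._entries.get(key, (None, 0))
            if time.time() * 1000 >= expiry:
                value = factory()  # cold start, e.g. launch a new SDK worker
                self._entries[key] = (value, time.time() * 1000 + self.ttl_millis)
            return value

With ttl_millis=0 every get() re-runs the factory, which matches the repeated "Logging handler created" / "Python sdk harness exiting" pairs throughout this log.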
19/11/26 05:19:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 05:19:17 INFO sdk_worker_main.main: Logging handler created.
19/11/26 05:19:17 INFO sdk_worker_main.start: Status HTTP server running at localhost:41367
19/11/26 05:19:17 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 05:19:17 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 05:19:17 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574745553.61_de983ce8-fcfe-4364-a487-858d27dff9a3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 05:19:17 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574745553.61', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48359', 'job_port': u'0'}
19/11/26 05:19:17 INFO statecache.__init__: Creating state cache with size 0
19/11/26 05:19:17 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35339.
19/11/26 05:19:17 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/26 05:19:17 INFO sdk_worker.__init__: Control channel established.
19/11/26 05:19:17 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 05:19:17 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46309.
19/11/26 05:19:17 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 05:19:17 INFO data_plane.create_data_channel: Creating client data channel for localhost:34135
19/11/26 05:19:17 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 05:19:17 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/26 05:19:17 INFO sdk_worker.run: No more requests from control plane
19/11/26 05:19:17 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 05:19:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 05:19:17 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 05:19:17 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 05:19:17 INFO sdk_worker.run: Done consuming work.
19/11/26 05:19:17 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 05:19:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 05:19:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 05:19:18 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/26 05:19:18 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 05:19:18 INFO sdk_worker_main.main: Logging handler created.
19/11/26 05:19:18 INFO sdk_worker_main.start: Status HTTP server running at localhost:44701
19/11/26 05:19:18 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 05:19:18 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 05:19:18 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574745553.61_de983ce8-fcfe-4364-a487-858d27dff9a3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 05:19:18 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574745553.61', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48359', 'job_port': u'0'}
19/11/26 05:19:18 INFO statecache.__init__: Creating state cache with size 0
19/11/26 05:19:18 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33047.
19/11/26 05:19:18 INFO sdk_worker.__init__: Control channel established.
19/11/26 05:19:18 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/26 05:19:18 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 05:19:18 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37171.
19/11/26 05:19:18 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 05:19:18 INFO data_plane.create_data_channel: Creating client data channel for localhost:38725
19/11/26 05:19:18 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 05:19:18 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/26 05:19:19 INFO sdk_worker.run: No more requests from control plane
19/11/26 05:19:19 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 05:19:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 05:19:19 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 05:19:19 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 05:19:19 INFO sdk_worker.run: Done consuming work.
19/11/26 05:19:19 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 05:19:19 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 05:19:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 05:19:19 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/26 05:19:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 05:19:20 INFO sdk_worker_main.main: Logging handler created.
19/11/26 05:19:20 INFO sdk_worker_main.start: Status HTTP server running at localhost:40209
19/11/26 05:19:20 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 05:19:20 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 05:19:20 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574745553.61_de983ce8-fcfe-4364-a487-858d27dff9a3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 05:19:20 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574745553.61', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48359', 'job_port': u'0'}
19/11/26 05:19:20 INFO statecache.__init__: Creating state cache with size 0
19/11/26 05:19:20 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36163.
19/11/26 05:19:20 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/26 05:19:20 INFO sdk_worker.__init__: Control channel established.
19/11/26 05:19:20 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 05:19:20 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42695.
19/11/26 05:19:20 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 05:19:20 INFO data_plane.create_data_channel: Creating client data channel for localhost:42323
19/11/26 05:19:20 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 05:19:20 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/26 05:19:20 INFO sdk_worker.run: No more requests from control plane
19/11/26 05:19:20 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 05:19:20 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 05:19:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 05:19:20 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 05:19:20 INFO sdk_worker.run: Done consuming work.
19/11/26 05:19:20 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 05:19:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 05:19:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 05:19:20 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574745553.61_de983ce8-fcfe-4364-a487-858d27dff9a3 finished.
19/11/26 05:19:20 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/26 05:19:20 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_6ad509d7-1470-499e-8411-e64526fa6bc8","basePath":"/tmp/sparktestsRNiHC"}: {}
java.io.FileNotFoundException: /tmp/sparktestsRNiHC/job_6ad509d7-1470-499e-8411-e64526fa6bc8/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
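The cleanup failure above is benign but noisy: the job staged no artifacts (note the repeated "GetManifest for __no_artifacts_staged__" lines), yet removeArtifacts still tries to open a MANIFEST that was never written. The "[BEAM-8815] Skip manifest when no artifacts are staged" change listed under build #1627 below targets exactly this. A rough sketch of the guard, written in Python for brevity (the actual fix lives in the Java BeamFileSystemArtifactStagingService; the names here are illustrative):

    import os

    NO_ARTIFACTS_STAGED_TOKEN = "__no_artifacts_staged__"

    def remove_artifacts(staging_token, manifest_path):
        # Nothing was staged, so there is no manifest or artifact dir to remove.
        if staging_token == NO_ARTIFACTS_STAGED_TOKEN:
            return
        # Treat an already-missing manifest as cleaned up rather than an error.
        if not os.path.exists(manifest_path):
            return
        # ... otherwise load the manifest and delete each listed artifact ...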
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
==================== Timed out after 60 seconds. ====================
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
# Thread: <Thread(wait_until_finish_read, started daemon 140059546216192)>

  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

# Thread: <Thread(Thread-120, started daemon 140059537823488)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
# Thread: <_MainThread(MainThread, started 140060530018048)>
==================== Timed out after 60 seconds. ====================

----------------------------------------------------------------------
Traceback (most recent call last):
# Thread: <Thread(wait_until_finish_read, started daemon 140059512645376)>

  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
# Thread: <Thread(Thread-126, started daemon 140059521038080)>

# Thread: <_MainThread(MainThread, started 140060530018048)>
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574745540.23_5b6d0961-fce7-4d65-b1cc-3c7d96385fd9 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 325.238s

FAILED (errors=3, skipped=9)
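The "Timed out after 60 seconds" banners and "# Thread:" listings woven through the tracebacks above come from the test harness's watchdog (the handler at portable_runner_test.py line 75 in each traceback), whose output interleaves with the test runner's because both write concurrently. A minimal sketch of that pattern, assuming a SIGALRM-based watchdog; the details are illustrative rather than Beam's exact code:

    import signal
    import threading
    import traceback

    TIMEOUT_SECS = 60

    def handler(signum, frame):
        msg = "Timed out after %d seconds." % TIMEOUT_SECS
        print("==================== %s ====================" % msg)
        # Dump every live thread so the hung wait is visible in the log.
        for t in threading.enumerate():
            print("# Thread: %r" % t)
        traceback.print_stack(frame)
        # BaseException, not Exception, so ordinary handlers cannot swallow it.
        raise BaseException(msg)

    signal.signal(signal.SIGALRM, handler)
    signal.alarm(TIMEOUT_SECS)
    # ... run the test; on success, cancel the watchdog with signal.alarm(0).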

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 10m 16s
60 actionable tasks: 48 executed, 12 from cache

Publishing build scan...
https://gradle.com/s/mafgopxei3aqu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1628

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1628/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/11/26 00:26:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 00:26:04 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/26 00:26:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 00:26:05 INFO sdk_worker_main.main: Logging handler created.
19/11/26 00:26:05 INFO sdk_worker_main.start: Status HTTP server running at localhost:34421
19/11/26 00:26:05 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 00:26:05 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 00:26:05 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574727962.59_29075974-d5df-44bd-bb30-d361466f63bf', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 00:26:05 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574727962.59', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:45139', 'job_port': u'0'}
19/11/26 00:26:05 INFO statecache.__init__: Creating state cache with size 0
19/11/26 00:26:05 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42615.
19/11/26 00:26:05 INFO sdk_worker.__init__: Control channel established.
19/11/26 00:26:05 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 00:26:05 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/26 00:26:05 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35983.
19/11/26 00:26:05 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 00:26:05 INFO data_plane.create_data_channel: Creating client data channel for localhost:44765
19/11/26 00:26:05 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 00:26:05 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/26 00:26:05 INFO sdk_worker.run: No more requests from control plane
19/11/26 00:26:05 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 00:26:05 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 00:26:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 00:26:05 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 00:26:05 INFO sdk_worker.run: Done consuming work.
19/11/26 00:26:05 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 00:26:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 00:26:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 00:26:05 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/26 00:26:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 00:26:06 INFO sdk_worker_main.main: Logging handler created.
19/11/26 00:26:06 INFO sdk_worker_main.start: Status HTTP server running at localhost:37381
19/11/26 00:26:06 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 00:26:06 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 00:26:06 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574727962.59_29075974-d5df-44bd-bb30-d361466f63bf', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 00:26:06 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574727962.59', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:45139', 'job_port': u'0'}
19/11/26 00:26:06 INFO statecache.__init__: Creating state cache with size 0
19/11/26 00:26:06 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39623.
19/11/26 00:26:06 INFO sdk_worker.__init__: Control channel established.
19/11/26 00:26:06 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 00:26:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/26 00:26:06 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40549.
19/11/26 00:26:06 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 00:26:06 INFO data_plane.create_data_channel: Creating client data channel for localhost:45079
19/11/26 00:26:06 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 00:26:06 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/26 00:26:06 INFO sdk_worker.run: No more requests from control plane
19/11/26 00:26:06 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 00:26:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 00:26:06 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 00:26:06 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 00:26:06 INFO sdk_worker.run: Done consuming work.
19/11/26 00:26:06 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 00:26:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 00:26:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 00:26:06 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/26 00:26:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 00:26:06 INFO sdk_worker_main.main: Logging handler created.
19/11/26 00:26:06 INFO sdk_worker_main.start: Status HTTP server running at localhost:32967
19/11/26 00:26:06 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 00:26:06 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 00:26:06 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574727962.59_29075974-d5df-44bd-bb30-d361466f63bf', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 00:26:06 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574727962.59', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:45139', 'job_port': u'0'}
19/11/26 00:26:06 INFO statecache.__init__: Creating state cache with size 0
19/11/26 00:26:06 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45457.
19/11/26 00:26:06 INFO sdk_worker.__init__: Control channel established.
19/11/26 00:26:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/26 00:26:06 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 00:26:06 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40439.
19/11/26 00:26:06 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 00:26:06 INFO data_plane.create_data_channel: Creating client data channel for localhost:44967
19/11/26 00:26:06 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 00:26:06 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/26 00:26:06 INFO sdk_worker.run: No more requests from control plane
19/11/26 00:26:06 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 00:26:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 00:26:06 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 00:26:06 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 00:26:06 INFO sdk_worker.run: Done consuming work.
19/11/26 00:26:06 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 00:26:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 00:26:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 00:26:07 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/26 00:26:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 00:26:07 INFO sdk_worker_main.main: Logging handler created.
19/11/26 00:26:07 INFO sdk_worker_main.start: Status HTTP server running at localhost:40403
19/11/26 00:26:07 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 00:26:07 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 00:26:07 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574727962.59_29075974-d5df-44bd-bb30-d361466f63bf', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 00:26:07 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574727962.59', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:45139', 'job_port': u'0'}
19/11/26 00:26:07 INFO statecache.__init__: Creating state cache with size 0
19/11/26 00:26:07 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39789.
19/11/26 00:26:07 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/26 00:26:07 INFO sdk_worker.__init__: Control channel established.
19/11/26 00:26:07 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 00:26:07 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38997.
19/11/26 00:26:07 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 00:26:07 INFO data_plane.create_data_channel: Creating client data channel for localhost:46621
19/11/26 00:26:07 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 00:26:07 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/26 00:26:07 INFO sdk_worker.run: No more requests from control plane
19/11/26 00:26:07 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 00:26:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 00:26:07 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 00:26:07 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 00:26:07 INFO sdk_worker.run: Done consuming work.
19/11/26 00:26:07 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 00:26:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 00:26:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 00:26:07 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574727962.59_29075974-d5df-44bd-bb30-d361466f63bf finished.
19/11/26 00:26:07 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/26 00:26:07 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_6b7e2412-94d3-4d5d-9142-ac83919c029a","basePath":"/tmp/sparktest2t2UBP"}: {}
java.io.FileNotFoundException: /tmp/sparktest2t2UBP/job_6b7e2412-94d3-4d5d-9142-ac83919c029a/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
==================== Timed out after 60 seconds. ====================

  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
# Thread: <Thread(wait_until_finish_read, started daemon 139639831103232)>

# Thread: <Thread(Thread-120, started daemon 139639822710528)>

# Thread: <_MainThread(MainThread, started 139640618936064)>
==================== Timed out after 60 seconds. ====================

BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
# Thread: <Thread(wait_until_finish_read, started daemon 139639805138688)>

  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
# Thread: <Thread(Thread-126, started daemon 139639813793536)>

RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574727953.16_5a97825c-4808-4c27-a4e7-1dbc56a65d7e failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 313.508s

# Thread: <_MainThread(MainThread, started 139640618936064)>
FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 42s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/kebpftbuvad3a

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1627

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1627/display/redirect?page=changes>

Changes:

[thw] [BEAM-8815] Skip manifest when no artifacts are staged


------------------------------------------
[...truncated 1.31 MB...]
19/11/25 22:38:37 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1574721516.16_3f5025b7-1561-4649-9eb5-14b47768dd66 on Spark master local
19/11/25 22:38:37 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/11/25 22:38:37 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574721516.16_3f5025b7-1561-4649-9eb5-14b47768dd66: Pipeline translated successfully. Computing outputs
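The GroupNonMergingWindowsFunctions warning above matters because the runner groups by the encoded bytes of the key and window, which is only safe when values that compare equal also encode identically ("consistent with equals"). A toy illustration of how that property can break, using plain Python rather than Beam's coders:

    import pickle

    d1 = {"a": 1, "b": 2}
    d2 = {"b": 2, "a": 1}  # same mapping, different insertion order
    assert d1 == d2        # equal values...
    # ...but on CPython 3.7+ their pickled bytes typically differ,
    # because pickling follows dict insertion order:
    print(pickle.dumps(d1) == pickle.dumps(d2))  # usually False

A runner keying on those bytes would put d1 and d2 in different groups even though they are the same logical key, which is the kind of subtle mis-grouping the warning cautions about.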
19/11/25 22:38:37 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/25 22:38:37 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 22:38:37 INFO sdk_worker_main.main: Logging handler created.
19/11/25 22:38:37 INFO sdk_worker_main.start: Status HTTP server running at localhost:39679
19/11/25 22:38:37 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 22:38:37 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 22:38:37 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574721516.16_3f5025b7-1561-4649-9eb5-14b47768dd66', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 22:38:37 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574721516.16', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58283', 'job_port': u'0'}
19/11/25 22:38:37 INFO statecache.__init__: Creating state cache with size 0
19/11/25 22:38:37 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38589.
19/11/25 22:38:37 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/11/25 22:38:37 INFO sdk_worker.__init__: Control channel established.
19/11/25 22:38:37 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 22:38:37 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34345.
19/11/25 22:38:37 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 22:38:37 INFO data_plane.create_data_channel: Creating client data channel for localhost:43641
19/11/25 22:38:37 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 22:38:38 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 22:38:38 INFO sdk_worker.run: No more requests from control plane
19/11/25 22:38:38 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 22:38:38 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 22:38:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 22:38:38 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 22:38:38 INFO sdk_worker.run: Done consuming work.
19/11/25 22:38:38 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 22:38:38 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 22:38:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 22:38:38 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/25 22:38:38 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 22:38:38 INFO sdk_worker_main.main: Logging handler created.
19/11/25 22:38:38 INFO sdk_worker_main.start: Status HTTP server running at localhost:36277
19/11/25 22:38:38 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 22:38:38 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 22:38:38 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574721516.16_3f5025b7-1561-4649-9eb5-14b47768dd66', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 22:38:38 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574721516.16', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58283', 'job_port': u'0'}
19/11/25 22:38:38 INFO statecache.__init__: Creating state cache with size 0
19/11/25 22:38:38 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40909.
19/11/25 22:38:38 INFO sdk_worker.__init__: Control channel established.
19/11/25 22:38:38 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/25 22:38:38 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 22:38:38 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46531.
19/11/25 22:38:38 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 22:38:38 INFO data_plane.create_data_channel: Creating client data channel for localhost:35217
19/11/25 22:38:38 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 22:38:39 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 22:38:39 INFO sdk_worker.run: No more requests from control plane
19/11/25 22:38:39 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 22:38:39 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 22:38:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 22:38:39 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 22:38:39 INFO sdk_worker.run: Done consuming work.
19/11/25 22:38:39 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 22:38:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 22:38:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 22:38:39 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/25 22:38:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 22:38:39 INFO sdk_worker_main.main: Logging handler created.
19/11/25 22:38:39 INFO sdk_worker_main.start: Status HTTP server running at localhost:44277
19/11/25 22:38:39 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 22:38:39 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 22:38:39 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574721516.16_3f5025b7-1561-4649-9eb5-14b47768dd66', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 22:38:39 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574721516.16', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58283', 'job_port': u'0'}
19/11/25 22:38:39 INFO statecache.__init__: Creating state cache with size 0
19/11/25 22:38:39 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44761.
19/11/25 22:38:39 INFO sdk_worker.__init__: Control channel established.
19/11/25 22:38:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/25 22:38:39 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 22:38:39 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41277.
19/11/25 22:38:39 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 22:38:39 INFO data_plane.create_data_channel: Creating client data channel for localhost:43829
19/11/25 22:38:39 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 22:38:40 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 22:38:40 INFO sdk_worker.run: No more requests from control plane
19/11/25 22:38:40 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 22:38:40 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 22:38:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 22:38:40 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 22:38:40 INFO sdk_worker.run: Done consuming work.
19/11/25 22:38:40 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 22:38:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 22:38:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 22:38:40 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/25 22:38:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 22:38:40 INFO sdk_worker_main.main: Logging handler created.
19/11/25 22:38:40 INFO sdk_worker_main.start: Status HTTP server running at localhost:45581
19/11/25 22:38:40 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 22:38:40 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 22:38:40 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574721516.16_3f5025b7-1561-4649-9eb5-14b47768dd66', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 22:38:40 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574721516.16', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58283', 'job_port': u'0'}
19/11/25 22:38:40 INFO statecache.__init__: Creating state cache with size 0
19/11/25 22:38:40 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33463.
19/11/25 22:38:40 INFO sdk_worker.__init__: Control channel established.
19/11/25 22:38:40 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/25 22:38:40 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 22:38:40 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45105.
19/11/25 22:38:40 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 22:38:40 INFO data_plane.create_data_channel: Creating client data channel for localhost:43837
19/11/25 22:38:40 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 22:38:40 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/25 22:38:40 INFO sdk_worker.run: No more requests from control plane
19/11/25 22:38:40 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 22:38:40 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 22:38:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 22:38:40 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 22:38:40 INFO sdk_worker.run: Done consuming work.
19/11/25 22:38:40 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 22:38:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 22:38:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 22:38:41 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/25 22:38:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 22:38:41 INFO sdk_worker_main.main: Logging handler created.
19/11/25 22:38:41 INFO sdk_worker_main.start: Status HTTP server running at localhost:40571
19/11/25 22:38:41 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 22:38:41 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 22:38:41 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574721516.16_3f5025b7-1561-4649-9eb5-14b47768dd66', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 22:38:41 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574721516.16', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58283', 'job_port': u'0'}
19/11/25 22:38:41 INFO statecache.__init__: Creating state cache with size 0
19/11/25 22:38:41 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42003.
19/11/25 22:38:41 INFO sdk_worker.__init__: Control channel established.
19/11/25 22:38:41 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/25 22:38:41 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 22:38:41 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38081.
19/11/25 22:38:41 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 22:38:41 INFO data_plane.create_data_channel: Creating client data channel for localhost:44451
19/11/25 22:38:41 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 22:38:41 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/25 22:38:41 INFO sdk_worker.run: No more requests from control plane
19/11/25 22:38:41 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 22:38:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 22:38:41 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 22:38:41 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 22:38:41 INFO sdk_worker.run: Done consuming work.
19/11/25 22:38:41 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 22:38:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 22:38:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 22:38:42 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574721516.16_3f5025b7-1561-4649-9eb5-14b47768dd66 finished.
19/11/25 22:38:42 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
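
(For context: on runners that do populate monitoring infos, metrics for a finished job are normally read back as sketched below. Given the warning above, on the Spark portable runner at this time such a query simply returns empty results. `pipeline` and the counter name are placeholders.)

    from apache_beam.metrics.metric import MetricsFilter

    result = pipeline.run()
    result.wait_until_finish()
    # query() returns {'counters': [...], 'distributions': [...], 'gauges': [...]};
    # all empty here because monitoring infos are not collected.
    counters = result.metrics().query(
        MetricsFilter().with_name('my_counter'))['counters']
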
19/11/25 22:38:42 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_c31878fc-d29b-4a84-977f-9a018813be9b","basePath":"/tmp/sparktestw3oExX"}: {}
java.io.FileNotFoundException: /tmp/sparktestw3oExX/job_c31878fc-d29b-4a84-977f-9a018813be9b/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
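
(The trace above shows cleanup failing because no MANIFEST was ever written for a job that staged no artifacts. Below is a minimal sketch of the defensive check that would avoid this, written in Python rather than the job server's actual Java code; the function and its arguments are illustrative only.)

    import os
    import shutil

    def remove_staging_dir(base_path, session_id):
        job_dir = os.path.join(base_path, session_id)
        manifest = os.path.join(job_dir, 'MANIFEST')
        if not os.path.exists(manifest):
            # Nothing was staged for this job (__no_artifacts_staged__),
            # so there is nothing to remove; skip instead of raising.
            return
        shutil.rmtree(job_dir)
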
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139913911588608)>
# Thread: <Thread(Thread-120, started daemon 139913919981312)>
# Thread: <_MainThread(MainThread, started 139914903885568)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574721506.25_bd164bee-224b-4c11-9b5e-d4aa5e22bdbe failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
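
(The failing assertion above follows Beam's standard testing pattern, shown here as a minimal sketch with a stand-in pipeline; exiting the `with` block is what triggers run().wait_until_finish() at pipeline.py line 436 and surfaces the runner's RuntimeError.)

    import apache_beam as beam
    from apache_beam.testing.util import assert_that, equal_to

    with beam.Pipeline() as p:
        actual = p | beam.Create(['a', 'b', 'c'])
        # equal_to compares contents ignoring order, matching how the
        # SDF test checks its output.
        assert_that(actual, equal_to(['a', 'b', 'c']))
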

----------------------------------------------------------------------
Ran 38 tests in 306.313s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 59s
60 actionable tasks: 51 executed, 9 from cache

Publishing build scan...
https://gradle.com/s/rru2uo5j44wro

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1626

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1626/display/redirect?page=changes>

Changes:

[worldkzd] fix typos

[github] Update class_test.go

[worldkzd] keep 'a https'


------------------------------------------
[...truncated 1.34 MB...]
19/11/25 20:53:43 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 20:53:43 INFO data_plane.create_data_channel: Creating client data channel for localhost:37353
19/11/25 20:53:43 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 20:53:43 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/25 20:53:43 INFO sdk_worker.run: No more requests from control plane
19/11/25 20:53:43 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 20:53:44 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 20:53:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 20:53:44 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 20:53:44 INFO sdk_worker.run: Done consuming work.
19/11/25 20:53:44 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 20:53:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 20:53:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 20:53:44 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest3ZGYBH/job_fb8f0e8b-0b65-43b2-a654-35f9f4899f7b/MANIFEST
19/11/25 20:53:44 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest3ZGYBH/job_fb8f0e8b-0b65-43b2-a654-35f9f4899f7b/MANIFEST -> 0 artifacts
19/11/25 20:53:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 20:53:44 INFO sdk_worker_main.main: Logging handler created.
19/11/25 20:53:44 INFO sdk_worker_main.start: Status HTTP server running at localhost:41129
19/11/25 20:53:44 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 20:53:44 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 20:53:44 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574715221.49_a5254e23-9906-4e43-9525-49ef97636999', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 20:53:44 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574715221.49', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48315', 'job_port': u'0'}
19/11/25 20:53:44 INFO statecache.__init__: Creating state cache with size 0
19/11/25 20:53:44 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46269.
19/11/25 20:53:44 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/25 20:53:44 INFO sdk_worker.__init__: Control channel established.
19/11/25 20:53:44 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 20:53:44 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46493.
19/11/25 20:53:44 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 20:53:44 INFO data_plane.create_data_channel: Creating client data channel for localhost:46561
19/11/25 20:53:44 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 20:53:45 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/25 20:53:45 INFO sdk_worker.run: No more requests from control plane
19/11/25 20:53:45 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 20:53:45 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 20:53:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 20:53:45 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 20:53:45 INFO sdk_worker.run: Done consuming work.
19/11/25 20:53:45 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 20:53:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 20:53:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 20:53:45 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest3ZGYBH/job_fb8f0e8b-0b65-43b2-a654-35f9f4899f7b/MANIFEST
19/11/25 20:53:45 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest3ZGYBH/job_fb8f0e8b-0b65-43b2-a654-35f9f4899f7b/MANIFEST -> 0 artifacts
19/11/25 20:53:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 20:53:46 INFO sdk_worker_main.main: Logging handler created.
19/11/25 20:53:46 INFO sdk_worker_main.start: Status HTTP server running at localhost:45085
19/11/25 20:53:46 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 20:53:46 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 20:53:46 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574715221.49_a5254e23-9906-4e43-9525-49ef97636999', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 20:53:46 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574715221.49', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48315', 'job_port': u'0'}
19/11/25 20:53:46 INFO statecache.__init__: Creating state cache with size 0
19/11/25 20:53:46 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34715.
19/11/25 20:53:46 INFO sdk_worker.__init__: Control channel established.
19/11/25 20:53:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/25 20:53:46 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 20:53:46 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46687.
19/11/25 20:53:46 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 20:53:46 INFO data_plane.create_data_channel: Creating client data channel for localhost:40203
19/11/25 20:53:46 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 20:53:46 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/25 20:53:46 INFO sdk_worker.run: No more requests from control plane
19/11/25 20:53:46 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 20:53:46 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 20:53:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 20:53:46 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 20:53:46 INFO sdk_worker.run: Done consuming work.
19/11/25 20:53:46 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 20:53:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 20:53:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 20:53:46 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest3ZGYBH/job_fb8f0e8b-0b65-43b2-a654-35f9f4899f7b/MANIFEST
19/11/25 20:53:46 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest3ZGYBH/job_fb8f0e8b-0b65-43b2-a654-35f9f4899f7b/MANIFEST -> 0 artifacts
19/11/25 20:53:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 20:53:47 INFO sdk_worker_main.main: Logging handler created.
19/11/25 20:53:47 INFO sdk_worker_main.start: Status HTTP server running at localhost:38943
19/11/25 20:53:47 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 20:53:47 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 20:53:47 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574715221.49_a5254e23-9906-4e43-9525-49ef97636999', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 20:53:47 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574715221.49', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48315', 'job_port': u'0'}
19/11/25 20:53:47 INFO statecache.__init__: Creating state cache with size 0
19/11/25 20:53:47 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36843.
19/11/25 20:53:47 INFO sdk_worker.__init__: Control channel established.
19/11/25 20:53:47 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/25 20:53:47 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 20:53:47 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44809.
19/11/25 20:53:47 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 20:53:47 INFO data_plane.create_data_channel: Creating client data channel for localhost:37811
19/11/25 20:53:47 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 20:53:47 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/25 20:53:47 INFO sdk_worker.run: No more requests from control plane
19/11/25 20:53:47 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 20:53:47 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 20:53:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 20:53:47 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 20:53:47 INFO sdk_worker.run: Done consuming work.
19/11/25 20:53:47 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 20:53:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 20:53:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 20:53:47 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest3ZGYBH/job_fb8f0e8b-0b65-43b2-a654-35f9f4899f7b/MANIFEST
19/11/25 20:53:47 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest3ZGYBH/job_fb8f0e8b-0b65-43b2-a654-35f9f4899f7b/MANIFEST -> 0 artifacts
19/11/25 20:53:48 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 20:53:48 INFO sdk_worker_main.main: Logging handler created.
19/11/25 20:53:48 INFO sdk_worker_main.start: Status HTTP server running at localhost:36555
19/11/25 20:53:48 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 20:53:48 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 20:53:48 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574715221.49_a5254e23-9906-4e43-9525-49ef97636999', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 20:53:48 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574715221.49', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48315', 'job_port': u'0'}
19/11/25 20:53:48 INFO statecache.__init__: Creating state cache with size 0
19/11/25 20:53:48 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41469.
19/11/25 20:53:48 INFO sdk_worker.__init__: Control channel established.
19/11/25 20:53:48 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/25 20:53:48 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 20:53:48 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36009.
19/11/25 20:53:48 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 20:53:48 INFO data_plane.create_data_channel: Creating client data channel for localhost:42787
19/11/25 20:53:48 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 20:53:48 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/25 20:53:48 INFO sdk_worker.run: No more requests from control plane
19/11/25 20:53:48 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 20:53:48 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 20:53:48 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 20:53:48 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 20:53:48 INFO sdk_worker.run: Done consuming work.
19/11/25 20:53:48 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 20:53:48 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 20:53:48 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 20:53:48 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574715221.49_a5254e23-9906-4e43-9525-49ef97636999 finished.
19/11/25 20:53:48 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/25 20:53:48 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktest3ZGYBH/job_fb8f0e8b-0b65-43b2-a654-35f9f4899f7b/MANIFEST has 0 artifact locations
19/11/25 20:53:48 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktest3ZGYBH/job_fb8f0e8b-0b65-43b2-a654-35f9f4899f7b/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
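
(The last two frames above come from the suite's watchdog: a SIGALRM handler that dumps the live threads and raises BaseException so even a hung gRPC wait unwinds. Below is a rough sketch of that pattern; the helper name and printing details are ours, only the 60-second figure comes from this log.)

    import signal
    import threading

    def install_timeout(seconds=60):
        def handler(signum, frame):
            msg = 'Timed out after %d seconds.' % seconds
            print('=' * 20 + ' ' + msg + ' ' + '=' * 20)
            for t in threading.enumerate():
                print('# Thread: %s' % t)  # matches the "# Thread:" lines
            raise BaseException(msg)  # unwinds past broad `except Exception`
        signal.signal(signal.SIGALRM, handler)
        signal.alarm(seconds)
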

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140190636517120)>
# Thread: <Thread(Thread-120, started daemon 140190907688704)>
# Thread: <_MainThread(MainThread, started 140191423080192)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140190139016960)>
# Thread: <Thread(Thread-126, started daemon 140190147671808)>
# Thread: <_MainThread(MainThread, started 140191423080192)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574715211.25_5fe56ce4-06a8-4c15-af24-c39004005585 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 313.578s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 6s
60 actionable tasks: 49 executed, 11 from cache

Publishing build scan...
https://gradle.com/s/5x7h2f6dswfhs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1625

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1625/display/redirect?page=changes>

Changes:

[robertwb] [BEAM-8575] Added a unit test that Reshuffle preserves timestamps


------------------------------------------
[...truncated 1.34 MB...]
19/11/25 19:27:07 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 19:27:07 INFO data_plane.create_data_channel: Creating client data channel for localhost:43261
19/11/25 19:27:07 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 19:27:07 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/25 19:27:07 INFO sdk_worker.run: No more requests from control plane
19/11/25 19:27:07 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 19:27:07 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 19:27:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 19:27:07 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 19:27:07 INFO sdk_worker.run: Done consuming work.
19/11/25 19:27:07 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 19:27:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 19:27:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 19:27:07 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest5DeGYq/job_06365812-f64b-4977-951a-7f95ddd36557/MANIFEST
19/11/25 19:27:07 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest5DeGYq/job_06365812-f64b-4977-951a-7f95ddd36557/MANIFEST -> 0 artifacts
19/11/25 19:27:08 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 19:27:08 INFO sdk_worker_main.main: Logging handler created.
19/11/25 19:27:08 INFO sdk_worker_main.start: Status HTTP server running at localhost:39387
19/11/25 19:27:08 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 19:27:08 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 19:27:08 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574710026.02_bb0c8209-df8e-4caa-a8f0-dbcd2163b6b1', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 19:27:08 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574710026.02', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43289', 'job_port': u'0'}
19/11/25 19:27:08 INFO statecache.__init__: Creating state cache with size 0
19/11/25 19:27:08 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45087.
19/11/25 19:27:08 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/25 19:27:08 INFO sdk_worker.__init__: Control channel established.
19/11/25 19:27:08 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 19:27:08 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41533.
19/11/25 19:27:08 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 19:27:08 INFO data_plane.create_data_channel: Creating client data channel for localhost:40457
19/11/25 19:27:08 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 19:27:08 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/25 19:27:08 INFO sdk_worker.run: No more requests from control plane
19/11/25 19:27:08 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 19:27:08 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 19:27:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 19:27:08 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 19:27:08 INFO sdk_worker.run: Done consuming work.
19/11/25 19:27:08 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 19:27:08 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 19:27:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 19:27:08 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest5DeGYq/job_06365812-f64b-4977-951a-7f95ddd36557/MANIFEST
19/11/25 19:27:08 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest5DeGYq/job_06365812-f64b-4977-951a-7f95ddd36557/MANIFEST -> 0 artifacts
19/11/25 19:27:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 19:27:09 INFO sdk_worker_main.main: Logging handler created.
19/11/25 19:27:09 INFO sdk_worker_main.start: Status HTTP server running at localhost:45817
19/11/25 19:27:09 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 19:27:09 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 19:27:09 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574710026.02_bb0c8209-df8e-4caa-a8f0-dbcd2163b6b1', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 19:27:09 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574710026.02', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43289', 'job_port': u'0'}
19/11/25 19:27:09 INFO statecache.__init__: Creating state cache with size 0
19/11/25 19:27:09 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43743.
19/11/25 19:27:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/25 19:27:09 INFO sdk_worker.__init__: Control channel established.
19/11/25 19:27:09 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 19:27:09 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44775.
19/11/25 19:27:09 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 19:27:09 INFO data_plane.create_data_channel: Creating client data channel for localhost:40719
19/11/25 19:27:09 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 19:27:09 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/25 19:27:09 INFO sdk_worker.run: No more requests from control plane
19/11/25 19:27:09 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 19:27:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 19:27:09 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 19:27:09 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 19:27:09 INFO sdk_worker.run: Done consuming work.
19/11/25 19:27:09 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 19:27:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 19:27:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 19:27:09 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest5DeGYq/job_06365812-f64b-4977-951a-7f95ddd36557/MANIFEST
19/11/25 19:27:09 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest5DeGYq/job_06365812-f64b-4977-951a-7f95ddd36557/MANIFEST -> 0 artifacts
19/11/25 19:27:10 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 19:27:10 INFO sdk_worker_main.main: Logging handler created.
19/11/25 19:27:10 INFO sdk_worker_main.start: Status HTTP server running at localhost:45001
19/11/25 19:27:10 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 19:27:10 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 19:27:10 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574710026.02_bb0c8209-df8e-4caa-a8f0-dbcd2163b6b1', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 19:27:10 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574710026.02', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43289', 'job_port': u'0'}
19/11/25 19:27:10 INFO statecache.__init__: Creating state cache with size 0
19/11/25 19:27:10 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39953.
19/11/25 19:27:10 INFO sdk_worker.__init__: Control channel established.
19/11/25 19:27:10 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/25 19:27:10 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 19:27:10 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40245.
19/11/25 19:27:10 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 19:27:10 INFO data_plane.create_data_channel: Creating client data channel for localhost:46551
19/11/25 19:27:10 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 19:27:10 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/25 19:27:10 INFO sdk_worker.run: No more requests from control plane
19/11/25 19:27:10 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 19:27:10 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 19:27:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 19:27:10 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 19:27:10 INFO sdk_worker.run: Done consuming work.
19/11/25 19:27:10 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 19:27:10 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 19:27:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 19:27:10 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest5DeGYq/job_06365812-f64b-4977-951a-7f95ddd36557/MANIFEST
19/11/25 19:27:10 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest5DeGYq/job_06365812-f64b-4977-951a-7f95ddd36557/MANIFEST -> 0 artifacts
19/11/25 19:27:11 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 19:27:11 INFO sdk_worker_main.main: Logging handler created.
19/11/25 19:27:11 INFO sdk_worker_main.start: Status HTTP server running at localhost:40923
19/11/25 19:27:11 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 19:27:11 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 19:27:11 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574710026.02_bb0c8209-df8e-4caa-a8f0-dbcd2163b6b1', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 19:27:11 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574710026.02', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43289', 'job_port': u'0'}
19/11/25 19:27:11 INFO statecache.__init__: Creating state cache with size 0
19/11/25 19:27:11 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43769.
19/11/25 19:27:11 INFO sdk_worker.__init__: Control channel established.
19/11/25 19:27:11 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/25 19:27:11 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 19:27:11 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41647.
19/11/25 19:27:11 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 19:27:11 INFO data_plane.create_data_channel: Creating client data channel for localhost:40895
19/11/25 19:27:11 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 19:27:11 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/25 19:27:11 INFO sdk_worker.run: No more requests from control plane
19/11/25 19:27:11 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 19:27:11 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 19:27:11 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 19:27:11 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 19:27:11 INFO sdk_worker.run: Done consuming work.
19/11/25 19:27:11 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 19:27:11 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 19:27:11 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 19:27:11 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574710026.02_bb0c8209-df8e-4caa-a8f0-dbcd2163b6b1 finished.
19/11/25 19:27:11 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/25 19:27:11 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktest5DeGYq/job_06365812-f64b-4977-951a-7f95ddd36557/MANIFEST has 0 artifact locations
19/11/25 19:27:11 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktest5DeGYq/job_06365812-f64b-4977-951a-7f95ddd36557/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
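
The handler frame at the bottom of this traceback, and the "# Thread:" listings further down, point at a watchdog in the test harness: an alarm fires after 60 seconds, dumps the live threads, and raises BaseException so a hung pipeline surfaces as a test error instead of stalling the suite. A minimal sketch of that pattern, assuming a SIGALRM-based watchdog (TIMEOUT_SECS and the handler body are illustrative, not Beam's exact code):

    import signal
    import threading

    TIMEOUT_SECS = 60  # matches the "Timed out after 60 seconds" message

    def handler(signum, frame):
        msg = 'Timed out after %d seconds.' % TIMEOUT_SECS
        print('==================== %s ====================' % msg)
        for t in threading.enumerate():
            # One line per live thread, like the "# Thread:" lines below.
            print('# Thread: %s' % t)
        # BaseException rather than Exception, so that broad
        # `except Exception` blocks in the code under test cannot swallow it.
        raise BaseException(msg)

    signal.signal(signal.SIGALRM, handler)
    signal.alarm(TIMEOUT_SECS)  # re-armed per test; signal.alarm(0) cancels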

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139978944124672)>

# Thread: <Thread(Thread-118, started daemon 139978935731968)>

# Thread: <_MainThread(MainThread, started 139979723355904)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139978909505280)>

# Thread: <Thread(Thread-124, started daemon 139978918160128)>

# Thread: <_MainThread(MainThread, started 139979723355904)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574710016.39_c5e4dd49-5c46-48de-a5f2-dba53c6ab232 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 301.054s

FAILED (errors=3, skipped=9)
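
Note that the test_sdf_with_watermark_tracking failure above is not a timeout: wait_until_finish saw the job reach the terminal FAILED state and raised RuntimeError carrying the runner's last error message (the Java-side UnsupportedOperationException about the missing bundle checkpoint handler). A standalone sketch of that failure path, with illustrative names rather than Beam's actual attributes:

    # Poll a stream of job states and surface terminal failure, in the spirit
    # of the final portable_runner.py frame in the traceback (names made up).
    def wait_until_finish(job_id, state_stream, last_error_message):
        for state in state_stream:      # blocks on the gRPC stream; on a hang
            if state == 'FAILED':       # this is where the watchdog fires
                raise RuntimeError('Pipeline %s failed in state FAILED: %s'
                                   % (job_id, last_error_message()))
            if state in ('DONE', 'CANCELLED'):
                return state

    # A stream ending in FAILED raises with the runner's message:
    try:
        wait_until_finish('demo_job', iter(['RUNNING', 'FAILED']),
                          lambda: 'java.lang.UnsupportedOperationException: ...')
    except RuntimeError as e:
        print(e)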

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 19s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/ywtai6yipuhzo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1624

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1624/display/redirect?page=changes>

Changes:

[pabloem] [BEAM-876] Support schemaUpdateOption in BigQueryIO (#9524)


------------------------------------------
[...truncated 1.34 MB...]
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 18:28:56 INFO sdk_worker.run: No more requests from control plane
19/11/25 18:28:56 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 18:28:56 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:28:56 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 18:28:56 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 18:28:56 INFO sdk_worker.run: Done consuming work.
19/11/25 18:28:56 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 18:28:56 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 18:28:56 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:28:56 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestikWkB3/job_c47c43d1-9598-4280-860e-c65c5768047e/MANIFEST
19/11/25 18:28:56 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestikWkB3/job_c47c43d1-9598-4280-860e-c65c5768047e/MANIFEST -> 0 artifacts
19/11/25 18:28:57 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 18:28:57 INFO sdk_worker_main.main: Logging handler created.
19/11/25 18:28:57 INFO sdk_worker_main.start: Status HTTP server running at localhost:39573
19/11/25 18:28:57 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 18:28:57 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 18:28:57 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574706534.31_06e05ea0-b99e-415a-9c52-4c48deec4c71', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 18:28:57 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574706534.31', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36165', 'job_port': u'0'}
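
The "Discarding unparseable args" warning just above is the worker-side options parser splitting off flags it does not recognize (runner- and test-harness flags such as --spark_master) instead of failing on them; the recognized remainder becomes the pipeline_options dict logged here. A rough sketch of that split using plain argparse (the registered flags below are a small illustrative subset, not Beam's full option set):

    import argparse
    import logging

    parser = argparse.ArgumentParser()
    parser.add_argument('--job_endpoint')
    parser.add_argument('--sdk_worker_parallelism', type=int, default=1)

    argv = ['--job_endpoint=localhost:36165', '--sdk_worker_parallelism=1',
            '--spark_master=local', '--options_id=30']
    known, unknown = parser.parse_known_args(argv)
    if unknown:
        # Logged rather than fatal: these flags matter to the runner or the
        # test harness, not to the Python SDK harness itself.
        logging.warning('Discarding unparseable args: %s', unknown)
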
19/11/25 18:28:57 INFO statecache.__init__: Creating state cache with size 0
19/11/25 18:28:57 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42527.
19/11/25 18:28:57 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/25 18:28:57 INFO sdk_worker.__init__: Control channel established.
19/11/25 18:28:57 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 18:28:57 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46191.
19/11/25 18:28:57 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 18:28:57 INFO data_plane.create_data_channel: Creating client data channel for localhost:44641
19/11/25 18:28:57 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 18:28:57 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 18:28:57 INFO sdk_worker.run: No more requests from control plane
19/11/25 18:28:57 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 18:28:57 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:28:57 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 18:28:57 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 18:28:57 INFO sdk_worker.run: Done consuming work.
19/11/25 18:28:57 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 18:28:57 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 18:28:57 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:28:57 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestikWkB3/job_c47c43d1-9598-4280-860e-c65c5768047e/MANIFEST
19/11/25 18:28:57 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestikWkB3/job_c47c43d1-9598-4280-860e-c65c5768047e/MANIFEST -> 0 artifacts
19/11/25 18:28:58 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 18:28:58 INFO sdk_worker_main.main: Logging handler created.
19/11/25 18:28:58 INFO sdk_worker_main.start: Status HTTP server running at localhost:41997
19/11/25 18:28:58 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 18:28:58 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 18:28:58 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574706534.31_06e05ea0-b99e-415a-9c52-4c48deec4c71', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 18:28:58 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574706534.31', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36165', 'job_port': u'0'}
19/11/25 18:28:58 INFO statecache.__init__: Creating state cache with size 0
19/11/25 18:28:58 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35043.
19/11/25 18:28:58 INFO sdk_worker.__init__: Control channel established.
19/11/25 18:28:58 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/25 18:28:58 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 18:28:58 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38185.
19/11/25 18:28:58 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 18:28:58 INFO data_plane.create_data_channel: Creating client data channel for localhost:45325
19/11/25 18:28:58 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 18:28:58 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 18:28:58 INFO sdk_worker.run: No more requests from control plane
19/11/25 18:28:58 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 18:28:58 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 18:28:58 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:28:58 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 18:28:58 INFO sdk_worker.run: Done consuming work.
19/11/25 18:28:58 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 18:28:58 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 18:28:58 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:28:58 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestikWkB3/job_c47c43d1-9598-4280-860e-c65c5768047e/MANIFEST
19/11/25 18:28:58 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestikWkB3/job_c47c43d1-9598-4280-860e-c65c5768047e/MANIFEST -> 0 artifacts
19/11/25 18:28:59 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 18:28:59 INFO sdk_worker_main.main: Logging handler created.
19/11/25 18:28:59 INFO sdk_worker_main.start: Status HTTP server running at localhost:46085
19/11/25 18:28:59 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 18:28:59 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 18:28:59 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574706534.31_06e05ea0-b99e-415a-9c52-4c48deec4c71', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 18:28:59 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574706534.31', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36165', 'job_port': u'0'}
19/11/25 18:28:59 INFO statecache.__init__: Creating state cache with size 0
19/11/25 18:28:59 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42507.
19/11/25 18:28:59 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/25 18:28:59 INFO sdk_worker.__init__: Control channel established.
19/11/25 18:28:59 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 18:28:59 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39541.
19/11/25 18:28:59 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 18:28:59 INFO data_plane.create_data_channel: Creating client data channel for localhost:45217
19/11/25 18:28:59 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 18:28:59 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 18:28:59 INFO sdk_worker.run: No more requests from control plane
19/11/25 18:28:59 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 18:28:59 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 18:28:59 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:28:59 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 18:28:59 INFO sdk_worker.run: Done consuming work.
19/11/25 18:28:59 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 18:28:59 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 18:28:59 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:28:59 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestikWkB3/job_c47c43d1-9598-4280-860e-c65c5768047e/MANIFEST
19/11/25 18:28:59 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestikWkB3/job_c47c43d1-9598-4280-860e-c65c5768047e/MANIFEST -> 0 artifacts
19/11/25 18:29:00 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 18:29:00 INFO sdk_worker_main.main: Logging handler created.
19/11/25 18:29:00 INFO sdk_worker_main.start: Status HTTP server running at localhost:41575
19/11/25 18:29:00 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 18:29:00 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 18:29:00 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574706534.31_06e05ea0-b99e-415a-9c52-4c48deec4c71', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 18:29:00 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574706534.31', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36165', 'job_port': u'0'}
19/11/25 18:29:00 INFO statecache.__init__: Creating state cache with size 0
19/11/25 18:29:00 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40795.
19/11/25 18:29:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/25 18:29:00 INFO sdk_worker.__init__: Control channel established.
19/11/25 18:29:00 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 18:29:00 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39147.
19/11/25 18:29:00 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 18:29:00 INFO data_plane.create_data_channel: Creating client data channel for localhost:36415
19/11/25 18:29:00 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 18:29:00 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 18:29:00 INFO sdk_worker.run: No more requests from control plane
19/11/25 18:29:00 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 18:29:00 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:29:00 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 18:29:00 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 18:29:00 INFO sdk_worker.run: Done consuming work.
19/11/25 18:29:00 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 18:29:00 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 18:29:00 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:29:00 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574706534.31_06e05ea0-b99e-415a-9c52-4c48deec4c71 finished.
19/11/25 18:29:00 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/25 18:29:00 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestikWkB3/job_c47c43d1-9598-4280-860e-c65c5768047e/MANIFEST has 0 artifact locations
19/11/25 18:29:00 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestikWkB3/job_c47c43d1-9598-4280-860e-c65c5768047e/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140261138609920)>

# Thread: <Thread(Thread-117, started daemon 140261130217216)>

# Thread: <_MainThread(MainThread, started 140261920012032)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140261105039104)>

# Thread: <Thread(Thread-123, started daemon 140261113431808)>

# Thread: <Thread(Thread-117, started daemon 140261130217216)>

# Thread: <_MainThread(MainThread, started 140261920012032)>

# Thread: <Thread(wait_until_finish_read, started daemon 140261138609920)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574706523.59_34fd9717-f676-40a1-a3e0-77e094720053 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 338.116s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 24s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/akdxcgncuten2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1623

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1623/display/redirect>

Changes:


------------------------------------------
[...truncated 1.34 MB...]
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 18:16:20 INFO sdk_worker.run: No more requests from control plane
19/11/25 18:16:20 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 18:16:20 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 18:16:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:16:20 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 18:16:20 INFO sdk_worker.run: Done consuming work.
19/11/25 18:16:20 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 18:16:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 18:16:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:16:20 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestBZ3zK_/job_694de2b4-d1be-423b-b717-7bd965e47230/MANIFEST
19/11/25 18:16:20 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestBZ3zK_/job_694de2b4-d1be-423b-b717-7bd965e47230/MANIFEST -> 0 artifacts
19/11/25 18:16:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 18:16:20 INFO sdk_worker_main.main: Logging handler created.
19/11/25 18:16:20 INFO sdk_worker_main.start: Status HTTP server running at localhost:45193
19/11/25 18:16:20 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 18:16:20 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 18:16:20 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574705778.15_153f640c-4c42-4b26-84ac-6e62a59e5b50', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 18:16:20 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574705778.15', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46059', 'job_port': u'0'}
19/11/25 18:16:20 INFO statecache.__init__: Creating state cache with size 0
19/11/25 18:16:20 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40919.
19/11/25 18:16:20 INFO sdk_worker.__init__: Control channel established.
19/11/25 18:16:20 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/25 18:16:20 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 18:16:20 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38741.
19/11/25 18:16:20 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 18:16:20 INFO data_plane.create_data_channel: Creating client data channel for localhost:36275
19/11/25 18:16:20 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 18:16:20 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 18:16:20 INFO sdk_worker.run: No more requests from control plane
19/11/25 18:16:20 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 18:16:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:16:20 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 18:16:20 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 18:16:20 INFO sdk_worker.run: Done consuming work.
19/11/25 18:16:20 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 18:16:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 18:16:21 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:16:21 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestBZ3zK_/job_694de2b4-d1be-423b-b717-7bd965e47230/MANIFEST
19/11/25 18:16:21 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestBZ3zK_/job_694de2b4-d1be-423b-b717-7bd965e47230/MANIFEST -> 0 artifacts
19/11/25 18:16:21 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 18:16:21 INFO sdk_worker_main.main: Logging handler created.
19/11/25 18:16:21 INFO sdk_worker_main.start: Status HTTP server running at localhost:40223
19/11/25 18:16:21 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 18:16:21 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 18:16:21 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574705778.15_153f640c-4c42-4b26-84ac-6e62a59e5b50', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 18:16:21 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574705778.15', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46059', 'job_port': u'0'}
19/11/25 18:16:21 INFO statecache.__init__: Creating state cache with size 0
19/11/25 18:16:21 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34533.
19/11/25 18:16:21 INFO sdk_worker.__init__: Control channel established.
19/11/25 18:16:21 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/25 18:16:21 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 18:16:21 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36123.
19/11/25 18:16:21 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 18:16:21 INFO data_plane.create_data_channel: Creating client data channel for localhost:32887
19/11/25 18:16:21 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 18:16:21 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 18:16:21 INFO sdk_worker.run: No more requests from control plane
19/11/25 18:16:21 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 18:16:21 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 18:16:21 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:16:21 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 18:16:21 INFO sdk_worker.run: Done consuming work.
19/11/25 18:16:21 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 18:16:21 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 18:16:22 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:16:22 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestBZ3zK_/job_694de2b4-d1be-423b-b717-7bd965e47230/MANIFEST
19/11/25 18:16:22 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestBZ3zK_/job_694de2b4-d1be-423b-b717-7bd965e47230/MANIFEST -> 0 artifacts
19/11/25 18:16:22 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 18:16:22 INFO sdk_worker_main.main: Logging handler created.
19/11/25 18:16:22 INFO sdk_worker_main.start: Status HTTP server running at localhost:40559
19/11/25 18:16:22 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 18:16:22 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 18:16:22 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574705778.15_153f640c-4c42-4b26-84ac-6e62a59e5b50', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 18:16:22 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574705778.15', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46059', 'job_port': u'0'}
19/11/25 18:16:22 INFO statecache.__init__: Creating state cache with size 0
19/11/25 18:16:22 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46197.
19/11/25 18:16:22 INFO sdk_worker.__init__: Control channel established.
19/11/25 18:16:22 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 18:16:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/25 18:16:22 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43217.
19/11/25 18:16:22 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 18:16:22 INFO data_plane.create_data_channel: Creating client data channel for localhost:40773
19/11/25 18:16:22 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 18:16:22 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 18:16:22 INFO sdk_worker.run: No more requests from control plane
19/11/25 18:16:22 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 18:16:22 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:16:22 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 18:16:22 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 18:16:22 INFO sdk_worker.run: Done consuming work.
19/11/25 18:16:22 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 18:16:22 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 18:16:22 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:16:22 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestBZ3zK_/job_694de2b4-d1be-423b-b717-7bd965e47230/MANIFEST
19/11/25 18:16:22 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestBZ3zK_/job_694de2b4-d1be-423b-b717-7bd965e47230/MANIFEST -> 0 artifacts
19/11/25 18:16:23 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 18:16:23 INFO sdk_worker_main.main: Logging handler created.
19/11/25 18:16:23 INFO sdk_worker_main.start: Status HTTP server running at localhost:35041
19/11/25 18:16:23 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 18:16:23 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 18:16:23 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574705778.15_153f640c-4c42-4b26-84ac-6e62a59e5b50', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 18:16:23 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574705778.15', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46059', 'job_port': u'0'}
19/11/25 18:16:23 INFO statecache.__init__: Creating state cache with size 0
19/11/25 18:16:23 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35799.
19/11/25 18:16:23 INFO sdk_worker.__init__: Control channel established.
19/11/25 18:16:23 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/25 18:16:23 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 18:16:23 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44101.
19/11/25 18:16:23 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 18:16:23 INFO data_plane.create_data_channel: Creating client data channel for localhost:38201
19/11/25 18:16:23 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 18:16:23 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 18:16:23 INFO sdk_worker.run: No more requests from control plane
19/11/25 18:16:23 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 18:16:23 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:16:23 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 18:16:23 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 18:16:23 INFO sdk_worker.run: Done consuming work.
19/11/25 18:16:23 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 18:16:23 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 18:16:23 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:16:23 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574705778.15_153f640c-4c42-4b26-84ac-6e62a59e5b50 finished.
19/11/25 18:16:23 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/25 18:16:23 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestBZ3zK_/job_694de2b4-d1be-423b-b717-7bd965e47230/MANIFEST has 0 artifact locations
19/11/25 18:16:23 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestBZ3zK_/job_694de2b4-d1be-423b-b717-7bd965e47230/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140318498010880)>

# Thread: <Thread(Thread-117, started daemon 140318489618176)>

# Thread: <_MainThread(MainThread, started 140319279621888)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140317856626432)>

# Thread: <Thread(Thread-123, started daemon 140318472832768)>

# Thread: <Thread(Thread-117, started daemon 140318489618176)>

# Thread: <_MainThread(MainThread, started 140319279621888)>

# Thread: <Thread(wait_until_finish_read, started daemon 140318498010880)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574705766.73_cce27912-03b8-4bf9-838f-6754d487ee92 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 311.314s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 1s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/4ux44a3gk6sy6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1622

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1622/display/redirect?page=changes>

Changes:

[kirillkozlov] Fix MongoDb SQL Integration Tests

[kirillkozlov] Add MongoDbIT back to build file

[kirillkozlov] Update JavaDoc comment and remove pipeline options


------------------------------------------
[...truncated 1.33 MB...]
19/11/25 18:02:51 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:02:51 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowed_pardo_state_timers_1574704965.96_86f8fc35-b902-475f-b232-756f5ca4d93f finished.
19/11/25 18:02:51 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/25 18:02:51 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestn5Xgq_/job_75e9ec83-f03c-4c4a-8c2c-30f486aafb98/MANIFEST has 0 artifact locations
19/11/25 18:02:51 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestn5Xgq_/job_75e9ec83-f03c-4c4a-8c2c-30f486aafb98/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function lift_combiners at 0x7f77bda41230> ====================
19/11/25 18:02:52 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job test_windowing_1574704971.37_eb845bb5-00c1-4c5b-906b-006f0a3cc0e8
19/11/25 18:02:52 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job invocation test_windowing_1574704971.37_eb845bb5-00c1-4c5b-906b-006f0a3cc0e8
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
19/11/25 18:02:52 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
19/11/25 18:02:52 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 1 files. (Enable logging at DEBUG level to see which files will be staged.)
19/11/25 18:02:52 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1574704971.37_eb845bb5-00c1-4c5b-906b-006f0a3cc0e8 on Spark master local
19/11/25 18:02:52 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
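
The "not consistent with equals" warning above concerns grouping performed on the encoded bytes of keys: that is only safe when values that compare equal always encode to identical byte strings. A toy illustration of what goes wrong otherwise (plain Python, not Beam code; the encode function is a stand-in for a coder):

    import collections

    def encode(key):
        return repr(key).encode('ascii')  # hypothetical, representation-leaking coder

    # 1 == 1.0 in Python, yet their encodings differ, so grouping by the
    # encoded key splits one logical key into two groups.
    pairs = [(1, 'a'), (1.0, 'b')]
    groups = collections.defaultdict(list)
    for key, value in pairs:
        groups[encode(key)].append(value)
    print(dict(groups))  # {b'1': ['a'], b'1.0': ['b']}
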
19/11/25 18:02:52 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574704971.37_eb845bb5-00c1-4c5b-906b-006f0a3cc0e8: Pipeline translated successfully. Computing outputs
19/11/25 18:02:52 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestn5Xgq_/job_72db3842-f758-4c7a-a5e6-0639753c0db6/MANIFEST
19/11/25 18:02:52 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestn5Xgq_/job_72db3842-f758-4c7a-a5e6-0639753c0db6/MANIFEST has 0 artifact locations
19/11/25 18:02:52 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestn5Xgq_/job_72db3842-f758-4c7a-a5e6-0639753c0db6/MANIFEST -> 0 artifacts
19/11/25 18:02:53 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 18:02:53 INFO sdk_worker_main.main: Logging handler created.
19/11/25 18:02:53 INFO sdk_worker_main.start: Status HTTP server running at localhost:40841
19/11/25 18:02:53 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 18:02:53 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 18:02:53 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574704971.37_eb845bb5-00c1-4c5b-906b-006f0a3cc0e8', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 18:02:53 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574704971.37', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:33413', 'job_port': u'0'}
19/11/25 18:02:53 INFO statecache.__init__: Creating state cache with size 0
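[Editor's note: the options echoed two lines up are the portable-runner flags the test harness passes through to the worker. A minimal sketch of constructing them by hand; the endpoint value is copied from the log line above, and the worker script path is a placeholder:

    from apache_beam.options.pipeline_options import PipelineOptions

    # Mirrors the worker's echoed options; values taken from the log above.
    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:33413',
        '--environment_type=PROCESS',
        '--environment_config={"command": "/path/to/sdk_worker.sh"}',
        '--experiments=beam_fn_api',
    ])

With environment_type=PROCESS, the runner launches the named command as the SDK harness process instead of a Docker container.]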
19/11/25 18:02:53 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40113.
19/11/25 18:02:53 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/11/25 18:02:53 INFO sdk_worker.__init__: Control channel established.
19/11/25 18:02:53 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 18:02:53 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34777.
19/11/25 18:02:53 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 18:02:53 INFO data_plane.create_data_channel: Creating client data channel for localhost:36689
19/11/25 18:02:53 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 18:02:53 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 18:02:53 INFO sdk_worker.run: No more requests from control plane
19/11/25 18:02:53 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 18:02:53 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:02:53 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 18:02:53 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 18:02:53 INFO sdk_worker.run: Done consuming work.
19/11/25 18:02:53 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 18:02:53 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 18:02:53 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:02:53 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestn5Xgq_/job_72db3842-f758-4c7a-a5e6-0639753c0db6/MANIFEST
19/11/25 18:02:53 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestn5Xgq_/job_72db3842-f758-4c7a-a5e6-0639753c0db6/MANIFEST -> 0 artifacts
19/11/25 18:02:53 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 18:02:53 INFO sdk_worker_main.main: Logging handler created.
19/11/25 18:02:53 INFO sdk_worker_main.start: Status HTTP server running at localhost:41363
19/11/25 18:02:53 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 18:02:53 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 18:02:53 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574704971.37_eb845bb5-00c1-4c5b-906b-006f0a3cc0e8', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 18:02:53 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574704971.37', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:33413', 'job_port': u'0'}
19/11/25 18:02:53 INFO statecache.__init__: Creating state cache with size 0
19/11/25 18:02:53 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34201.
19/11/25 18:02:53 INFO sdk_worker.__init__: Control channel established.
19/11/25 18:02:53 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 18:02:53 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/25 18:02:53 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44899.
19/11/25 18:02:53 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 18:02:53 INFO data_plane.create_data_channel: Creating client data channel for localhost:42715
19/11/25 18:02:53 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 18:02:53 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 18:02:53 INFO sdk_worker.run: No more requests from control plane
19/11/25 18:02:53 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 18:02:53 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 18:02:53 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:02:53 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 18:02:53 INFO sdk_worker.run: Done consuming work.
19/11/25 18:02:53 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 18:02:53 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 18:02:54 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:02:54 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestn5Xgq_/job_72db3842-f758-4c7a-a5e6-0639753c0db6/MANIFEST
19/11/25 18:02:54 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestn5Xgq_/job_72db3842-f758-4c7a-a5e6-0639753c0db6/MANIFEST -> 0 artifacts
19/11/25 18:02:54 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 18:02:54 INFO sdk_worker_main.main: Logging handler created.
19/11/25 18:02:54 INFO sdk_worker_main.start: Status HTTP server running at localhost:35859
19/11/25 18:02:54 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 18:02:54 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 18:02:54 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574704971.37_eb845bb5-00c1-4c5b-906b-006f0a3cc0e8', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 18:02:54 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574704971.37', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:33413', 'job_port': u'0'}
19/11/25 18:02:54 INFO statecache.__init__: Creating state cache with size 0
19/11/25 18:02:54 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40877.
19/11/25 18:02:54 INFO sdk_worker.__init__: Control channel established.
19/11/25 18:02:54 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 18:02:54 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/25 18:02:54 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41493.
19/11/25 18:02:54 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 18:02:54 INFO data_plane.create_data_channel: Creating client data channel for localhost:40425
19/11/25 18:02:54 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 18:02:54 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 18:02:54 INFO sdk_worker.run: No more requests from control plane
19/11/25 18:02:54 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 18:02:54 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 18:02:54 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 18:02:54 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:02:54 INFO sdk_worker.run: Done consuming work.
19/11/25 18:02:54 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 18:02:54 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 18:02:54 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:02:54 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestn5Xgq_/job_72db3842-f758-4c7a-a5e6-0639753c0db6/MANIFEST
19/11/25 18:02:54 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestn5Xgq_/job_72db3842-f758-4c7a-a5e6-0639753c0db6/MANIFEST -> 0 artifacts
19/11/25 18:02:55 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 18:02:55 INFO sdk_worker_main.main: Logging handler created.
19/11/25 18:02:55 INFO sdk_worker_main.start: Status HTTP server running at localhost:40325
19/11/25 18:02:55 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 18:02:55 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 18:02:55 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574704971.37_eb845bb5-00c1-4c5b-906b-006f0a3cc0e8', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 18:02:55 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574704971.37', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:33413', 'job_port': u'0'}
19/11/25 18:02:55 INFO statecache.__init__: Creating state cache with size 0
19/11/25 18:02:55 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43215.
19/11/25 18:02:55 INFO sdk_worker.__init__: Control channel established.
19/11/25 18:02:55 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 18:02:55 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/25 18:02:55 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36697.
19/11/25 18:02:55 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 18:02:55 INFO data_plane.create_data_channel: Creating client data channel for localhost:46427
19/11/25 18:02:55 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 18:02:55 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 18:02:55 INFO sdk_worker.run: No more requests from control plane
19/11/25 18:02:55 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 18:02:55 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:02:55 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 18:02:55 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 18:02:55 INFO sdk_worker.run: Done consuming work.
19/11/25 18:02:55 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 18:02:55 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 18:02:55 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:02:55 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestn5Xgq_/job_72db3842-f758-4c7a-a5e6-0639753c0db6/MANIFEST
19/11/25 18:02:55 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestn5Xgq_/job_72db3842-f758-4c7a-a5e6-0639753c0db6/MANIFEST -> 0 artifacts
19/11/25 18:02:56 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 18:02:56 INFO sdk_worker_main.main: Logging handler created.
19/11/25 18:02:56 INFO sdk_worker_main.start: Status HTTP server running at localhost:46675
19/11/25 18:02:56 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 18:02:56 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 18:02:56 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574704971.37_eb845bb5-00c1-4c5b-906b-006f0a3cc0e8', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 18:02:56 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574704971.37', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:33413', 'job_port': u'0'}
19/11/25 18:02:56 INFO statecache.__init__: Creating state cache with size 0
19/11/25 18:02:56 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46777.
19/11/25 18:02:56 INFO sdk_worker.__init__: Control channel established.
19/11/25 18:02:56 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/25 18:02:56 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 18:02:56 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44623.
19/11/25 18:02:56 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 18:02:56 INFO data_plane.create_data_channel: Creating client data channel for localhost:41611
19/11/25 18:02:56 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 18:02:56 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 18:02:56 INFO sdk_worker.run: No more requests from control plane
19/11/25 18:02:56 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 18:02:56 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 18:02:56 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:02:56 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 18:02:56 INFO sdk_worker.run: Done consuming work.
19/11/25 18:02:56 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 18:02:56 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 18:02:56 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:02:56 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574704971.37_eb845bb5-00c1-4c5b-906b-006f0a3cc0e8 finished.
19/11/25 18:02:56 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/25 18:02:56 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestn5Xgq_/job_72db3842-f758-4c7a-a5e6-0639753c0db6/MANIFEST has 0 artifact locations
19/11/25 18:02:56 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestn5Xgq_/job_72db3842-f758-4c7a-a5e6-0639753c0db6/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
==================== Timed out after 60 seconds. ====================
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler

# Thread: <Thread(wait_until_finish_read, started daemon 140151745402624)>

# Thread: <Thread(Thread-117, started daemon 140151737009920)>

    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
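[Editor's note: test_pardo_state_with_custom_key_coder drives user state through a non-standard key coder, and the timeout above is the test harness's 60-second watchdog raising BaseException while wait_until_finish is still blocked on the job-state stream. The stateful-DoFn shape the test exercises looks roughly like this; a sketch with hypothetical names, not the test's actual code:

    import apache_beam as beam
    from apache_beam.coders import coders
    from apache_beam.transforms import userstate

    class BufferingFn(beam.DoFn):
        # Hypothetical stateful DoFn: per-key bag state, addressed through
        # the key's coder, which is what the failing test stresses.
        BUFFER = userstate.BagStateSpec('buffer', coders.VarIntCoder())

        def process(self, kv, buffer=beam.DoFn.StateParam(BUFFER)):
            key, value = kv
            buffer.add(value)
            yield key, sorted(buffer.read())
]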

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
# Thread: <_MainThread(MainThread, started 140152740226816)>
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574704962.48_5d4d7c3e-4fdb-41cb-96b3-148f2977583e failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
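[Editor's note: the UnsupportedOperationException above appears to come from the runner side: running this test requires the runner to accept checkpoints/residuals produced by a splittable DoFn, which the portable Spark runner had not wired up at this point. The rough shape of the SDF the test runs, sketched under the 2019 Python SDF API with hypothetical names:

    import apache_beam as beam
    from apache_beam.io.restriction_trackers import OffsetRange, OffsetRestrictionTracker
    from apache_beam.transforms.core import RestrictionProvider

    class CharRestrictionProvider(RestrictionProvider):
        # Hypothetical provider: one offset per character of the element.
        def initial_restriction(self, element):
            return OffsetRange(0, len(element))

        def create_tracker(self, restriction):
            return OffsetRestrictionTracker(restriction)

        def restriction_size(self, element, restriction):
            return restriction.size()

    class ExpandCharsFn(beam.DoFn):
        # Emits each character of the input string, claiming offsets so the
        # runner may split or checkpoint the remaining work.
        def process(
            self,
            element,
            tracker=beam.DoFn.RestrictionParam(CharRestrictionProvider())):
            cur = tracker.current_restriction().start
            while tracker.try_claim(cur):
                yield element[cur]
                cur += 1
]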

----------------------------------------------------------------------
Ran 38 tests in 285.147s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
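[Editor's note: to reproduce locally, one would presumably invoke the failing task directly with the suggested flags, e.g. ./gradlew :sdks:python:test-suites:portable:py2:sparkValidatesRunner --stacktrace --info; the flags above are Gradle's generic advice rather than anything Beam-specific.]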

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 9s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/vnndw2fs3wdhg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1621

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1621/display/redirect>

Changes:


------------------------------------------
[...truncated 1.34 MB...]
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 12:10:57 INFO sdk_worker.run: No more requests from control plane
19/11/25 12:10:57 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 12:10:57 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 12:10:57 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 12:10:57 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 12:10:57 INFO sdk_worker.run: Done consuming work.
19/11/25 12:10:57 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 12:10:57 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 12:10:58 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 12:10:58 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestzb50d6/job_ef50b933-2ad7-4339-91db-ef785b85da23/MANIFEST
19/11/25 12:10:58 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestzb50d6/job_ef50b933-2ad7-4339-91db-ef785b85da23/MANIFEST -> 0 artifacts
19/11/25 12:10:58 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 12:10:58 INFO sdk_worker_main.main: Logging handler created.
19/11/25 12:10:58 INFO sdk_worker_main.start: Status HTTP server running at localhost:33809
19/11/25 12:10:58 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 12:10:58 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 12:10:58 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574683856.16_6a8c438d-2aa7-4dea-b5b6-5c3dc9251dcd', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 12:10:58 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574683856.16', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35875', 'job_port': u'0'}
19/11/25 12:10:58 INFO statecache.__init__: Creating state cache with size 0
19/11/25 12:10:58 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41607.
19/11/25 12:10:58 INFO sdk_worker.__init__: Control channel established.
19/11/25 12:10:58 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/25 12:10:58 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 12:10:58 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46043.
19/11/25 12:10:58 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 12:10:58 INFO data_plane.create_data_channel: Creating client data channel for localhost:44049
19/11/25 12:10:58 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 12:10:58 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 12:10:58 INFO sdk_worker.run: No more requests from control plane
19/11/25 12:10:58 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 12:10:58 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 12:10:58 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 12:10:58 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 12:10:58 INFO sdk_worker.run: Done consuming work.
19/11/25 12:10:58 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 12:10:58 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 12:10:59 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 12:10:59 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestzb50d6/job_ef50b933-2ad7-4339-91db-ef785b85da23/MANIFEST
19/11/25 12:10:59 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestzb50d6/job_ef50b933-2ad7-4339-91db-ef785b85da23/MANIFEST -> 0 artifacts
19/11/25 12:10:59 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 12:10:59 INFO sdk_worker_main.main: Logging handler created.
19/11/25 12:10:59 INFO sdk_worker_main.start: Status HTTP server running at localhost:39955
19/11/25 12:10:59 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 12:10:59 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 12:10:59 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574683856.16_6a8c438d-2aa7-4dea-b5b6-5c3dc9251dcd', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 12:10:59 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574683856.16', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35875', 'job_port': u'0'}
19/11/25 12:10:59 INFO statecache.__init__: Creating state cache with size 0
19/11/25 12:10:59 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45485.
19/11/25 12:10:59 INFO sdk_worker.__init__: Control channel established.
19/11/25 12:10:59 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 12:10:59 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/25 12:10:59 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42405.
19/11/25 12:10:59 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 12:10:59 INFO data_plane.create_data_channel: Creating client data channel for localhost:42259
19/11/25 12:10:59 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 12:10:59 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 12:10:59 INFO sdk_worker.run: No more requests from control plane
19/11/25 12:10:59 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 12:10:59 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 12:10:59 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 12:10:59 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 12:10:59 INFO sdk_worker.run: Done consuming work.
19/11/25 12:10:59 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 12:10:59 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 12:11:00 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 12:11:00 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestzb50d6/job_ef50b933-2ad7-4339-91db-ef785b85da23/MANIFEST
19/11/25 12:11:00 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestzb50d6/job_ef50b933-2ad7-4339-91db-ef785b85da23/MANIFEST -> 0 artifacts
19/11/25 12:11:00 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 12:11:00 INFO sdk_worker_main.main: Logging handler created.
19/11/25 12:11:00 INFO sdk_worker_main.start: Status HTTP server running at localhost:36559
19/11/25 12:11:00 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 12:11:00 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 12:11:00 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574683856.16_6a8c438d-2aa7-4dea-b5b6-5c3dc9251dcd', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 12:11:00 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574683856.16', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35875', 'job_port': u'0'}
19/11/25 12:11:00 INFO statecache.__init__: Creating state cache with size 0
19/11/25 12:11:00 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33507.
19/11/25 12:11:00 INFO sdk_worker.__init__: Control channel established.
19/11/25 12:11:00 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 12:11:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/25 12:11:00 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42291.
19/11/25 12:11:00 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 12:11:00 INFO data_plane.create_data_channel: Creating client data channel for localhost:44873
19/11/25 12:11:00 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 12:11:00 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 12:11:00 INFO sdk_worker.run: No more requests from control plane
19/11/25 12:11:00 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 12:11:00 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 12:11:00 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 12:11:00 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 12:11:00 INFO sdk_worker.run: Done consuming work.
19/11/25 12:11:00 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 12:11:00 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 12:11:00 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 12:11:01 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestzb50d6/job_ef50b933-2ad7-4339-91db-ef785b85da23/MANIFEST
19/11/25 12:11:01 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestzb50d6/job_ef50b933-2ad7-4339-91db-ef785b85da23/MANIFEST -> 0 artifacts
19/11/25 12:11:01 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 12:11:01 INFO sdk_worker_main.main: Logging handler created.
19/11/25 12:11:01 INFO sdk_worker_main.start: Status HTTP server running at localhost:35669
19/11/25 12:11:01 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 12:11:01 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 12:11:01 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574683856.16_6a8c438d-2aa7-4dea-b5b6-5c3dc9251dcd', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 12:11:01 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574683856.16', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35875', 'job_port': u'0'}
19/11/25 12:11:01 INFO statecache.__init__: Creating state cache with size 0
19/11/25 12:11:01 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37843.
19/11/25 12:11:01 INFO sdk_worker.__init__: Control channel established.
19/11/25 12:11:01 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 12:11:01 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/25 12:11:01 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46211.
19/11/25 12:11:01 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 12:11:01 INFO data_plane.create_data_channel: Creating client data channel for localhost:38813
19/11/25 12:11:01 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 12:11:01 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 12:11:01 INFO sdk_worker.run: No more requests from control plane
19/11/25 12:11:01 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 12:11:01 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 12:11:01 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 12:11:01 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 12:11:01 INFO sdk_worker.run: Done consuming work.
19/11/25 12:11:01 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 12:11:01 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 12:11:01 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 12:11:01 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574683856.16_6a8c438d-2aa7-4dea-b5b6-5c3dc9251dcd finished.
19/11/25 12:11:01 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/25 12:11:01 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestzb50d6/job_ef50b933-2ad7-4339-91db-ef785b85da23/MANIFEST has 0 artifact locations
19/11/25 12:11:01 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestzb50d6/job_ef50b933-2ad7-4339-91db-ef785b85da23/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
==================== Timed out after 60 seconds. ====================
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler

# Thread: <Thread(wait_until_finish_read, started daemon 139803754198784)>

    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
# Thread: <Thread(Thread-120, started daemon 139803745806080)>

ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
# Thread: <_MainThread(MainThread, started 139804549363456)>
==================== Timed out after 60 seconds. ====================

  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
# Thread: <Thread(wait_until_finish_read, started daemon 139803257464576)>

  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
# Thread: <Thread(Thread-124, started daemon 139803265857280)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
# Thread: <_MainThread(MainThread, started 139804549363456)>

# Thread: <Thread(wait_until_finish_read, started daemon 139803754198784)>

    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
# Thread: <Thread(Thread-120, started daemon 139803745806080)>
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
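[Editor's note: test_pardo_timers, which also timed out here, exercises event-time timers. The pattern it drives looks roughly like this; a sketch with hypothetical names, not the test's actual code:

    import apache_beam as beam
    from apache_beam.transforms import userstate

    class TimerFn(beam.DoFn):
        # Hypothetical DoFn: sets an event-time (watermark) timer per key
        # and emits from the callback once the watermark passes it.
        EXPIRY = userstate.TimerSpec('expiry', userstate.TimeDomain.WATERMARK)

        def process(self, kv, timer=beam.DoFn.TimerParam(EXPIRY)):
            key, ts = kv
            timer.set(ts)

        @userstate.on_timer(EXPIRY)
        def expire(self):
            yield 'fired'
]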

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574683844.64_ab82ddb2-947f-4863-9a8b-3c188b83bed6 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 324.524s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 9s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/on2lff5fisjbs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1620

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1620/display/redirect>

Changes:


------------------------------------------
[...truncated 1.34 MB...]
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 06:16:26 INFO sdk_worker.run: No more requests from control plane
19/11/25 06:16:26 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 06:16:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 06:16:26 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 06:16:26 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 06:16:26 INFO sdk_worker.run: Done consuming work.
19/11/25 06:16:26 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 06:16:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 06:16:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 06:16:27 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestDjyB2X/job_cad19b92-71fc-48ce-8889-c139730105bf/MANIFEST
19/11/25 06:16:27 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestDjyB2X/job_cad19b92-71fc-48ce-8889-c139730105bf/MANIFEST -> 0 artifacts
19/11/25 06:16:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 06:16:27 INFO sdk_worker_main.main: Logging handler created.
19/11/25 06:16:27 INFO sdk_worker_main.start: Status HTTP server running at localhost:45403
19/11/25 06:16:27 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 06:16:27 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 06:16:27 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574662584.74_29599733-7164-427a-85a1-6373e04db2de', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 06:16:27 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574662584.74', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52089', 'job_port': u'0'}
19/11/25 06:16:27 INFO statecache.__init__: Creating state cache with size 0
19/11/25 06:16:27 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33469.
19/11/25 06:16:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/25 06:16:27 INFO sdk_worker.__init__: Control channel established.
19/11/25 06:16:27 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 06:16:27 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33503.
19/11/25 06:16:27 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 06:16:27 INFO data_plane.create_data_channel: Creating client data channel for localhost:33411
19/11/25 06:16:27 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 06:16:27 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 06:16:27 INFO sdk_worker.run: No more requests from control plane
19/11/25 06:16:27 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 06:16:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 06:16:27 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 06:16:27 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 06:16:27 INFO sdk_worker.run: Done consuming work.
19/11/25 06:16:27 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 06:16:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 06:16:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 06:16:28 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestDjyB2X/job_cad19b92-71fc-48ce-8889-c139730105bf/MANIFEST
19/11/25 06:16:28 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestDjyB2X/job_cad19b92-71fc-48ce-8889-c139730105bf/MANIFEST -> 0 artifacts
19/11/25 06:16:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 06:16:28 INFO sdk_worker_main.main: Logging handler created.
19/11/25 06:16:28 INFO sdk_worker_main.start: Status HTTP server running at localhost:37509
19/11/25 06:16:28 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 06:16:28 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 06:16:28 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574662584.74_29599733-7164-427a-85a1-6373e04db2de', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 06:16:28 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574662584.74', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52089', 'job_port': u'0'}
19/11/25 06:16:28 INFO statecache.__init__: Creating state cache with size 0
19/11/25 06:16:28 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45343.
19/11/25 06:16:28 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/25 06:16:28 INFO sdk_worker.__init__: Control channel established.
19/11/25 06:16:28 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 06:16:28 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38331.
19/11/25 06:16:28 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 06:16:28 INFO data_plane.create_data_channel: Creating client data channel for localhost:43569
19/11/25 06:16:28 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 06:16:29 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 06:16:29 INFO sdk_worker.run: No more requests from control plane
19/11/25 06:16:29 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 06:16:29 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 06:16:29 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 06:16:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 06:16:29 INFO sdk_worker.run: Done consuming work.
19/11/25 06:16:29 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 06:16:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 06:16:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 06:16:29 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestDjyB2X/job_cad19b92-71fc-48ce-8889-c139730105bf/MANIFEST
19/11/25 06:16:29 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestDjyB2X/job_cad19b92-71fc-48ce-8889-c139730105bf/MANIFEST -> 0 artifacts
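[Editor's note] Each worker lifecycle in this log runs under environment_type=PROCESS, with the environment_config JSON in the pipeline_options above pointing at the sdk_worker.sh launcher. As a minimal sketch of how such a pipeline is submitted to the portable job server (flag names taken from the pipeline_options in the log; the endpoint port and script path below are placeholders, not this job's actual values):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Placeholder values; substitute the real job server endpoint and
    # the real SDK harness launcher path for an actual run.
    opts = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:8099',
        '--environment_type=PROCESS',
        '--environment_config={"command": "/path/to/sdk_worker.sh"}',
        '--experiments=beam_fn_api',
    ])

    with beam.Pipeline(options=opts) as p:
        p | beam.Create([1, 2, 3])  # stand-in transform

With environment_type=PROCESS the runner launches the configured command for each SDK worker, which matches the repeated harness start/exit records in this log.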
19/11/25 06:16:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 06:16:30 INFO sdk_worker_main.main: Logging handler created.
19/11/25 06:16:30 INFO sdk_worker_main.start: Status HTTP server running at localhost:43721
19/11/25 06:16:30 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 06:16:30 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 06:16:30 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574662584.74_29599733-7164-427a-85a1-6373e04db2de', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 06:16:30 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574662584.74', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52089', 'job_port': u'0'}
19/11/25 06:16:30 INFO statecache.__init__: Creating state cache with size 0
19/11/25 06:16:30 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38623.
19/11/25 06:16:30 INFO sdk_worker.__init__: Control channel established.
19/11/25 06:16:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/25 06:16:30 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 06:16:30 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40411.
19/11/25 06:16:30 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 06:16:30 INFO data_plane.create_data_channel: Creating client data channel for localhost:38903
19/11/25 06:16:30 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 06:16:30 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 06:16:30 INFO sdk_worker.run: No more requests from control plane
19/11/25 06:16:30 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 06:16:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 06:16:30 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 06:16:30 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 06:16:30 INFO sdk_worker.run: Done consuming work.
19/11/25 06:16:30 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 06:16:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 06:16:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 06:16:30 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestDjyB2X/job_cad19b92-71fc-48ce-8889-c139730105bf/MANIFEST
19/11/25 06:16:30 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestDjyB2X/job_cad19b92-71fc-48ce-8889-c139730105bf/MANIFEST -> 0 artifacts
19/11/25 06:16:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 06:16:31 INFO sdk_worker_main.main: Logging handler created.
19/11/25 06:16:31 INFO sdk_worker_main.start: Status HTTP server running at localhost:43027
19/11/25 06:16:31 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 06:16:31 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 06:16:31 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574662584.74_29599733-7164-427a-85a1-6373e04db2de', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 06:16:31 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574662584.74', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52089', 'job_port': u'0'}
19/11/25 06:16:31 INFO statecache.__init__: Creating state cache with size 0
19/11/25 06:16:31 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35533.
19/11/25 06:16:31 INFO sdk_worker.__init__: Control channel established.
19/11/25 06:16:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/25 06:16:31 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 06:16:31 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44111.
19/11/25 06:16:31 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 06:16:31 INFO data_plane.create_data_channel: Creating client data channel for localhost:42837
19/11/25 06:16:31 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 06:16:31 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 06:16:31 INFO sdk_worker.run: No more requests from control plane
19/11/25 06:16:31 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 06:16:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 06:16:31 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 06:16:31 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 06:16:31 INFO sdk_worker.run: Done consuming work.
19/11/25 06:16:31 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 06:16:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 06:16:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 06:16:31 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574662584.74_29599733-7164-427a-85a1-6373e04db2de finished.
19/11/25 06:16:31 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/25 06:16:31 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestDjyB2X/job_cad19b92-71fc-48ce-8889-c139730105bf/MANIFEST has 0 artifact locations
19/11/25 06:16:31 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestDjyB2X/job_cad19b92-71fc-48ce-8889-c139730105bf/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
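[Editor's note] The "Timed out after 60 seconds." banners and "# Thread:" lines that appear interleaved with the tracebacks in these runs come from the suite's per-test timeout guard (the handler at portable_runner_test.py line 75 above, which raises BaseException). A rough sketch of such a guard, assuming a SIGALRM-based implementation with hypothetical names; the actual helper may differ:

    import signal
    import threading

    TIMEOUT_SECONDS = 60  # matches the timeout reported in the log

    def install_test_timeout(timeout=TIMEOUT_SECONDS):
        def handler(signum, frame):
            msg = 'Timed out after %d seconds.' % timeout
            print('==================== %s ====================' % msg)
            # Dump every live thread to show where the test is stuck.
            for t in threading.enumerate():
                print('# Thread: %r' % t)
            # BaseException rather than Exception so that broad
            # `except Exception` clauses cannot swallow the timeout.
            raise BaseException(msg)
        signal.signal(signal.SIGALRM, handler)
        signal.alarm(timeout)

Because the handler prints from whatever thread is interrupted, its output lands wherever the test runner happens to be writing, which is why the banners and thread dumps cut across the tracebacks below.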

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140247383926528)>
# Thread: <Thread(Thread-117, started daemon 140247375533824)>
# Thread: <_MainThread(MainThread, started 140248510146304)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140247358748416)>
# Thread: <Thread(Thread-123, started daemon 140247367141120)>
# Thread: <_MainThread(MainThread, started 140248510146304)>
# Thread: <Thread(Thread-117, started daemon 140247375533824)>
# Thread: <Thread(wait_until_finish_read, started daemon 140247383926528)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574662573.02_5011cda4-931d-40ef-a51b-56a93311a735 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
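[Editor's note] Unlike the two timeouts, this failure is a hard error surfaced by wait_until_finish(): the exception message suggests the portable Spark runner received a checkpoint (residual) from the splittable DoFn under test but had no checkpoint handler registered, i.e. SDF checkpointing was not yet supported there. The assert_that/equal_to calls in these tracebacks are Beam's standard pipeline assertions from apache_beam.testing.util; a self-contained stand-in (not the failing SDF test itself):

    import apache_beam as beam
    from apache_beam.testing.test_pipeline import TestPipeline
    from apache_beam.testing.util import assert_that, equal_to

    with TestPipeline() as p:
        actual = p | beam.Create(['a', 'b', 'c'])
        # equal_to compares PCollection contents ignoring order; the
        # check only runs when the pipeline executes, which is why the
        # failures above surface under __exit__ ->
        # run().wait_until_finish() rather than at the assert_that call.
        assert_that(actual, equal_to(['a', 'b', 'c']))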

----------------------------------------------------------------------
Ran 38 tests in 352.103s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
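[Editor's note] For local debugging, the failing task named in the summary can be re-run directly with the flags Gradle suggests, e.g. (assuming a Beam source checkout):

    ./gradlew :sdks:python:test-suites:portable:py2:sparkValidatesRunner --stacktrace --info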

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 51s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/3mnmfgcocm5lk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1619

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1619/display/redirect>

Changes:


------------------------------------------
[...truncated 1.33 MB...]
19/11/25 00:12:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 00:12:03 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowed_pardo_state_timers_1574640717.82_3656bd3d-312a-4a68-846a-c4cf1c74dd3f finished.
19/11/25 00:12:03 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/25 00:12:03 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktest4sfRF0/job_ce26435b-7f53-4efb-ae83-d7e748c7075d/MANIFEST has 0 artifact locations
19/11/25 00:12:03 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktest4sfRF0/job_ce26435b-7f53-4efb-ae83-d7e748c7075d/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function lift_combiners at 0x7f7286534230> ====================
19/11/25 00:12:04 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job test_windowing_1574640723.08_0d363e57-9d14-4ca2-8a77-50f53142ffc5
19/11/25 00:12:04 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job invocation test_windowing_1574640723.08_0d363e57-9d14-4ca2-8a77-50f53142ffc5
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
19/11/25 00:12:04 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
19/11/25 00:12:04 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 1 files. (Enable logging at DEBUG level to see which files will be staged.)
19/11/25 00:12:04 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1574640723.08_0d363e57-9d14-4ca2-8a77-50f53142ffc5 on Spark master local
19/11/25 00:12:04 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/11/25 00:12:04 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574640723.08_0d363e57-9d14-4ca2-8a77-50f53142ffc5: Pipeline translated successfully. Computing outputs
19/11/25 00:12:04 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest4sfRF0/job_e8e0b38e-f150-4a46-a53f-25adfa15a47c/MANIFEST
19/11/25 00:12:04 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktest4sfRF0/job_e8e0b38e-f150-4a46-a53f-25adfa15a47c/MANIFEST has 0 artifact locations
19/11/25 00:12:04 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest4sfRF0/job_e8e0b38e-f150-4a46-a53f-25adfa15a47c/MANIFEST -> 0 artifacts
19/11/25 00:12:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 00:12:04 INFO sdk_worker_main.main: Logging handler created.
19/11/25 00:12:04 INFO sdk_worker_main.start: Status HTTP server running at localhost:40625
19/11/25 00:12:04 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 00:12:04 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 00:12:04 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574640723.08_0d363e57-9d14-4ca2-8a77-50f53142ffc5', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 00:12:04 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574640723.08', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42673', 'job_port': u'0'}
19/11/25 00:12:04 INFO statecache.__init__: Creating state cache with size 0
19/11/25 00:12:04 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42961.
19/11/25 00:12:04 INFO sdk_worker.__init__: Control channel established.
19/11/25 00:12:04 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 00:12:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/11/25 00:12:04 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39927.
19/11/25 00:12:04 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 00:12:04 INFO data_plane.create_data_channel: Creating client data channel for localhost:35461
19/11/25 00:12:04 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 00:12:04 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 00:12:04 INFO sdk_worker.run: No more requests from control plane
19/11/25 00:12:04 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 00:12:04 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 00:12:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 00:12:04 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 00:12:04 INFO sdk_worker.run: Done consuming work.
19/11/25 00:12:04 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 00:12:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 00:12:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 00:12:04 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest4sfRF0/job_e8e0b38e-f150-4a46-a53f-25adfa15a47c/MANIFEST
19/11/25 00:12:04 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest4sfRF0/job_e8e0b38e-f150-4a46-a53f-25adfa15a47c/MANIFEST -> 0 artifacts
19/11/25 00:12:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 00:12:05 INFO sdk_worker_main.main: Logging handler created.
19/11/25 00:12:05 INFO sdk_worker_main.start: Status HTTP server running at localhost:42933
19/11/25 00:12:05 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 00:12:05 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 00:12:05 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574640723.08_0d363e57-9d14-4ca2-8a77-50f53142ffc5', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 00:12:05 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574640723.08', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42673', 'job_port': u'0'}
19/11/25 00:12:05 INFO statecache.__init__: Creating state cache with size 0
19/11/25 00:12:05 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36185.
19/11/25 00:12:05 INFO sdk_worker.__init__: Control channel established.
19/11/25 00:12:05 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/25 00:12:05 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 00:12:05 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37889.
19/11/25 00:12:05 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 00:12:05 INFO data_plane.create_data_channel: Creating client data channel for localhost:37753
19/11/25 00:12:05 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 00:12:05 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 00:12:05 INFO sdk_worker.run: No more requests from control plane
19/11/25 00:12:05 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 00:12:05 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 00:12:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 00:12:05 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 00:12:05 INFO sdk_worker.run: Done consuming work.
19/11/25 00:12:05 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 00:12:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 00:12:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 00:12:05 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest4sfRF0/job_e8e0b38e-f150-4a46-a53f-25adfa15a47c/MANIFEST
19/11/25 00:12:05 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest4sfRF0/job_e8e0b38e-f150-4a46-a53f-25adfa15a47c/MANIFEST -> 0 artifacts
19/11/25 00:12:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 00:12:06 INFO sdk_worker_main.main: Logging handler created.
19/11/25 00:12:06 INFO sdk_worker_main.start: Status HTTP server running at localhost:44627
19/11/25 00:12:06 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 00:12:06 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 00:12:06 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574640723.08_0d363e57-9d14-4ca2-8a77-50f53142ffc5', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 00:12:06 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574640723.08', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42673', 'job_port': u'0'}
19/11/25 00:12:06 INFO statecache.__init__: Creating state cache with size 0
19/11/25 00:12:06 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35351.
19/11/25 00:12:06 INFO sdk_worker.__init__: Control channel established.
19/11/25 00:12:06 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 00:12:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/25 00:12:06 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43383.
19/11/25 00:12:06 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 00:12:06 INFO data_plane.create_data_channel: Creating client data channel for localhost:38823
19/11/25 00:12:06 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 00:12:06 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 00:12:06 INFO sdk_worker.run: No more requests from control plane
19/11/25 00:12:06 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 00:12:06 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 00:12:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 00:12:06 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 00:12:06 INFO sdk_worker.run: Done consuming work.
19/11/25 00:12:06 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 00:12:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 00:12:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 00:12:06 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest4sfRF0/job_e8e0b38e-f150-4a46-a53f-25adfa15a47c/MANIFEST
19/11/25 00:12:06 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest4sfRF0/job_e8e0b38e-f150-4a46-a53f-25adfa15a47c/MANIFEST -> 0 artifacts
19/11/25 00:12:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 00:12:07 INFO sdk_worker_main.main: Logging handler created.
19/11/25 00:12:07 INFO sdk_worker_main.start: Status HTTP server running at localhost:32845
19/11/25 00:12:07 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 00:12:07 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 00:12:07 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574640723.08_0d363e57-9d14-4ca2-8a77-50f53142ffc5', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 00:12:07 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574640723.08', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42673', 'job_port': u'0'}
19/11/25 00:12:07 INFO statecache.__init__: Creating state cache with size 0
19/11/25 00:12:07 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42771.
19/11/25 00:12:07 INFO sdk_worker.__init__: Control channel established.
19/11/25 00:12:07 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/25 00:12:07 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 00:12:07 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44161.
19/11/25 00:12:07 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 00:12:07 INFO data_plane.create_data_channel: Creating client data channel for localhost:38357
19/11/25 00:12:07 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 00:12:07 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 00:12:07 INFO sdk_worker.run: No more requests from control plane
19/11/25 00:12:07 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 00:12:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 00:12:07 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 00:12:07 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 00:12:07 INFO sdk_worker.run: Done consuming work.
19/11/25 00:12:07 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 00:12:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 00:12:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 00:12:07 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest4sfRF0/job_e8e0b38e-f150-4a46-a53f-25adfa15a47c/MANIFEST
19/11/25 00:12:07 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest4sfRF0/job_e8e0b38e-f150-4a46-a53f-25adfa15a47c/MANIFEST -> 0 artifacts
19/11/25 00:12:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 00:12:07 INFO sdk_worker_main.main: Logging handler created.
19/11/25 00:12:07 INFO sdk_worker_main.start: Status HTTP server running at localhost:35337
19/11/25 00:12:07 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 00:12:07 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 00:12:07 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574640723.08_0d363e57-9d14-4ca2-8a77-50f53142ffc5', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 00:12:07 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574640723.08', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42673', 'job_port': u'0'}
19/11/25 00:12:07 INFO statecache.__init__: Creating state cache with size 0
19/11/25 00:12:07 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44931.
19/11/25 00:12:07 INFO sdk_worker.__init__: Control channel established.
19/11/25 00:12:07 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/25 00:12:07 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 00:12:07 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41141.
19/11/25 00:12:07 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 00:12:07 INFO data_plane.create_data_channel: Creating client data channel for localhost:45377
19/11/25 00:12:07 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 00:12:07 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 00:12:07 INFO sdk_worker.run: No more requests from control plane
19/11/25 00:12:07 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 00:12:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 00:12:08 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 00:12:08 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 00:12:08 INFO sdk_worker.run: Done consuming work.
19/11/25 00:12:08 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 00:12:08 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 00:12:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 00:12:08 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574640723.08_0d363e57-9d14-4ca2-8a77-50f53142ffc5 finished.
19/11/25 00:12:08 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/25 00:12:08 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktest4sfRF0/job_e8e0b38e-f150-4a46-a53f-25adfa15a47c/MANIFEST has 0 artifact locations
19/11/25 00:12:08 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktest4sfRF0/job_e8e0b38e-f150-4a46-a53f-25adfa15a47c/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
==================== Timed out after 60 seconds. ====================
    _common.wait(self._state.condition.wait, _response_ready)

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

# Thread: <Thread(wait_until_finish_read, started daemon 140129263933184)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "ap# Thread: <Thread(Thread-118, started daemon 140129280718592)>

# Thread: <_MainThread(MainThread, started 140130133227264)>
ache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574640714.42_6b489e63-beb7-49c3-b55c-710eb8e0cf6f failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 267.897s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 56s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/admhpvb6ipom4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1618

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1618/display/redirect?page=changes>

Changes:

[github] [BEAM-8803] BigQuery Streaming Inserts are always retried by default.


------------------------------------------
[...truncated 1.34 MB...]
19/11/24 22:03:59 INFO sdk_worker.create_state_handler: State channel established.
19/11/24 22:03:59 INFO data_plane.create_data_channel: Creating client data channel for localhost:45097
19/11/24 22:03:59 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/24 22:03:59 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/24 22:03:59 INFO sdk_worker.run: No more requests from control plane
19/11/24 22:03:59 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/24 22:03:59 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 22:03:59 INFO data_plane.close: Closing all cached grpc data channels.
19/11/24 22:03:59 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/24 22:03:59 INFO sdk_worker.run: Done consuming work.
19/11/24 22:03:59 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/24 22:03:59 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/24 22:03:59 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 22:03:59 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest1XH4CR/job_f3e944ba-0ad7-434e-af6f-695e8828bda3/MANIFEST
19/11/24 22:03:59 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest1XH4CR/job_f3e944ba-0ad7-434e-af6f-695e8828bda3/MANIFEST -> 0 artifacts
19/11/24 22:04:00 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/24 22:04:00 INFO sdk_worker_main.main: Logging handler created.
19/11/24 22:04:00 INFO sdk_worker_main.start: Status HTTP server running at localhost:46859
19/11/24 22:04:00 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/24 22:04:00 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/24 22:04:00 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574633037.9_b06215fe-525e-48f4-87aa-64f935809647', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/24 22:04:00 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574633037.9', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53041', 'job_port': u'0'}
19/11/24 22:04:00 INFO statecache.__init__: Creating state cache with size 0
19/11/24 22:04:00 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43341.
19/11/24 22:04:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/24 22:04:00 INFO sdk_worker.__init__: Control channel established.
19/11/24 22:04:00 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/24 22:04:00 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42101.
19/11/24 22:04:00 INFO sdk_worker.create_state_handler: State channel established.
19/11/24 22:04:00 INFO data_plane.create_data_channel: Creating client data channel for localhost:46819
19/11/24 22:04:00 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/24 22:04:00 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/24 22:04:00 INFO sdk_worker.run: No more requests from control plane
19/11/24 22:04:00 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/24 22:04:00 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 22:04:00 INFO data_plane.close: Closing all cached grpc data channels.
19/11/24 22:04:00 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/24 22:04:00 INFO sdk_worker.run: Done consuming work.
19/11/24 22:04:00 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/24 22:04:00 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/24 22:04:00 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 22:04:00 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest1XH4CR/job_f3e944ba-0ad7-434e-af6f-695e8828bda3/MANIFEST
19/11/24 22:04:00 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest1XH4CR/job_f3e944ba-0ad7-434e-af6f-695e8828bda3/MANIFEST -> 0 artifacts
19/11/24 22:04:01 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/24 22:04:01 INFO sdk_worker_main.main: Logging handler created.
19/11/24 22:04:01 INFO sdk_worker_main.start: Status HTTP server running at localhost:38977
19/11/24 22:04:01 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/24 22:04:01 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/24 22:04:01 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574633037.9_b06215fe-525e-48f4-87aa-64f935809647', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/24 22:04:01 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574633037.9', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53041', 'job_port': u'0'}
19/11/24 22:04:01 INFO statecache.__init__: Creating state cache with size 0
19/11/24 22:04:01 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39587.
19/11/24 22:04:01 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/24 22:04:01 INFO sdk_worker.__init__: Control channel established.
19/11/24 22:04:01 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/24 22:04:01 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44629.
19/11/24 22:04:01 INFO sdk_worker.create_state_handler: State channel established.
19/11/24 22:04:01 INFO data_plane.create_data_channel: Creating client data channel for localhost:37745
19/11/24 22:04:01 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/24 22:04:01 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/24 22:04:01 INFO sdk_worker.run: No more requests from control plane
19/11/24 22:04:01 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/24 22:04:01 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 22:04:01 INFO data_plane.close: Closing all cached grpc data channels.
19/11/24 22:04:01 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/24 22:04:01 INFO sdk_worker.run: Done consuming work.
19/11/24 22:04:01 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/24 22:04:01 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/24 22:04:01 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 22:04:01 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest1XH4CR/job_f3e944ba-0ad7-434e-af6f-695e8828bda3/MANIFEST
19/11/24 22:04:01 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest1XH4CR/job_f3e944ba-0ad7-434e-af6f-695e8828bda3/MANIFEST -> 0 artifacts
19/11/24 22:04:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/24 22:04:02 INFO sdk_worker_main.main: Logging handler created.
19/11/24 22:04:02 INFO sdk_worker_main.start: Status HTTP server running at localhost:43757
19/11/24 22:04:02 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/24 22:04:02 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/24 22:04:02 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574633037.9_b06215fe-525e-48f4-87aa-64f935809647', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/24 22:04:02 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574633037.9', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53041', 'job_port': u'0'}
19/11/24 22:04:02 INFO statecache.__init__: Creating state cache with size 0
19/11/24 22:04:02 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34207.
19/11/24 22:04:02 INFO sdk_worker.__init__: Control channel established.
19/11/24 22:04:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/24 22:04:02 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/24 22:04:02 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34365.
19/11/24 22:04:02 INFO sdk_worker.create_state_handler: State channel established.
19/11/24 22:04:02 INFO data_plane.create_data_channel: Creating client data channel for localhost:41095
19/11/24 22:04:02 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/24 22:04:02 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/24 22:04:02 INFO sdk_worker.run: No more requests from control plane
19/11/24 22:04:02 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/24 22:04:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 22:04:02 INFO data_plane.close: Closing all cached grpc data channels.
19/11/24 22:04:02 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/24 22:04:02 INFO sdk_worker.run: Done consuming work.
19/11/24 22:04:02 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/24 22:04:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/24 22:04:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 22:04:02 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest1XH4CR/job_f3e944ba-0ad7-434e-af6f-695e8828bda3/MANIFEST
19/11/24 22:04:02 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest1XH4CR/job_f3e944ba-0ad7-434e-af6f-695e8828bda3/MANIFEST -> 0 artifacts
19/11/24 22:04:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/24 22:04:03 INFO sdk_worker_main.main: Logging handler created.
19/11/24 22:04:03 INFO sdk_worker_main.start: Status HTTP server running at localhost:41567
19/11/24 22:04:03 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/24 22:04:03 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/24 22:04:03 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574633037.9_b06215fe-525e-48f4-87aa-64f935809647', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/24 22:04:03 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574633037.9', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53041', 'job_port': u'0'}
19/11/24 22:04:03 INFO statecache.__init__: Creating state cache with size 0
19/11/24 22:04:03 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33919.
19/11/24 22:04:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/24 22:04:03 INFO sdk_worker.__init__: Control channel established.
19/11/24 22:04:03 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/24 22:04:03 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45077.
19/11/24 22:04:03 INFO sdk_worker.create_state_handler: State channel established.
19/11/24 22:04:03 INFO data_plane.create_data_channel: Creating client data channel for localhost:44645
19/11/24 22:04:03 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/24 22:04:03 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/24 22:04:03 INFO sdk_worker.run: No more requests from control plane
19/11/24 22:04:03 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/24 22:04:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 22:04:03 INFO data_plane.close: Closing all cached grpc data channels.
19/11/24 22:04:03 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/24 22:04:03 INFO sdk_worker.run: Done consuming work.
19/11/24 22:04:03 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/24 22:04:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/24 22:04:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 22:04:03 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574633037.9_b06215fe-525e-48f4-87aa-64f935809647 finished.
19/11/24 22:04:03 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/24 22:04:03 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktest1XH4CR/job_f3e944ba-0ad7-434e-af6f-695e8828bda3/MANIFEST has 0 artifact locations
19/11/24 22:04:03 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktest1XH4CR/job_f3e944ba-0ad7-434e-af6f-695e8828bda3/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
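
The handler frame at portable_runner_test.py line 75 in the traceback above is a test-side watchdog: it fires on a timer, dumps every live thread (the "# Thread: ..." lines scattered through this log), and raises BaseException so the timeout cannot be swallowed by a broad except clause. A minimal sketch of that pattern, assuming a SIGALRM-based alarm and a 60-second budget (the constant and formatting are illustrative):

    import signal
    import sys
    import threading
    import traceback

    TIMEOUT_SECS = 60  # the log shows a 60-second limit

    def handler(signum, frame):
        msg = 'Timed out after %d seconds.' % TIMEOUT_SECS
        print('==================== %s ====================' % msg)
        # Dump each live thread's stack, like the "# Thread: ..." lines here.
        for thread in threading.enumerate():
            print('# Thread: %s' % thread)
            stack = sys._current_frames().get(thread.ident)
            if stack is not None:
                traceback.print_stack(stack)
        # BaseException, not Exception, so test-body except clauses can't eat it.
        raise BaseException(msg)

    signal.signal(signal.SIGALRM, handler)
    signal.alarm(TIMEOUT_SECS)  # arm the watchdog before the test body runs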

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139835201283840)>

# Thread: <Thread(Thread-119, started daemon 139835209676544)>

# Thread: <_MainThread(MainThread, started 139836068628224)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574633027.73_b11ce318-d852-4860-8c2f-87751080226a failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139835184498432)>

# Thread: <Thread(Thread-125, started daemon 139835192891136)>

# Thread: <_MainThread(MainThread, started 139836068628224)>
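
Note the two distinct failure shapes in this run: in the two timeout errors, wait_until_finish hangs reading the job-state stream until the watchdog fires, while in this last error the stream does deliver a terminal state, but it is FAILED rather than DONE, so the client raises RuntimeError itself. A hedged sketch of that second path (simplified to plain state strings; the real stream carries protobuf job-state messages):

    class PortablePipelineResultSketch(object):
        """Sketch of the RuntimeError path; not the actual Beam class."""

        def __init__(self, job_id, state_stream):
            self._job_id = job_id
            self._state_stream = state_stream  # in Beam: a gRPC job-state stream

        def wait_until_finish(self):
            state = 'UNKNOWN'
            for state in self._state_stream:  # blocks; the watchdog covers hangs
                if state in ('DONE', 'FAILED', 'CANCELLED'):
                    break
            if state != 'DONE':
                raise RuntimeError('Pipeline %s failed in state %s: %s' % (
                    self._job_id, state, 'see runner logs'))
            return state

    # PortablePipelineResultSketch('job-1', iter(['RUNNING', 'FAILED'])).wait_until_finish()
    # -> RuntimeError: Pipeline job-1 failed in state FAILED: see runner logs
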
----------------------------------------------------------------------
Ran 38 tests in 311.239s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 2s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/jy34b2gaeubvo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1617

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1617/display/redirect>

Changes:


------------------------------------------
[...truncated 1.34 MB...]
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/24 18:12:42 INFO sdk_worker.run: No more requests from control plane
19/11/24 18:12:42 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/24 18:12:42 INFO data_plane.close: Closing all cached grpc data channels.
19/11/24 18:12:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 18:12:42 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/24 18:12:42 INFO sdk_worker.run: Done consuming work.
19/11/24 18:12:42 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/24 18:12:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/24 18:12:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 18:12:42 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestXZVdpS/job_c7fc6347-c1bb-47e6-b530-611580f87e8a/MANIFEST
19/11/24 18:12:42 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestXZVdpS/job_c7fc6347-c1bb-47e6-b530-611580f87e8a/MANIFEST -> 0 artifacts
19/11/24 18:12:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/24 18:12:43 INFO sdk_worker_main.main: Logging handler created.
19/11/24 18:12:43 INFO sdk_worker_main.start: Status HTTP server running at localhost:35093
19/11/24 18:12:43 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/24 18:12:43 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/24 18:12:43 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574619161.09_1af24fd7-e549-4087-8c79-b07dc7f43611', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/24 18:12:43 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574619161.09', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46255', 'job_port': u'0'}
19/11/24 18:12:43 INFO statecache.__init__: Creating state cache with size 0
19/11/24 18:12:43 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40417.
19/11/24 18:12:43 INFO sdk_worker.__init__: Control channel established.
19/11/24 18:12:43 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/24 18:12:43 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/24 18:12:43 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40511.
19/11/24 18:12:43 INFO sdk_worker.create_state_handler: State channel established.
19/11/24 18:12:43 INFO data_plane.create_data_channel: Creating client data channel for localhost:45009
19/11/24 18:12:43 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/24 18:12:43 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/24 18:12:43 INFO sdk_worker.run: No more requests from control plane
19/11/24 18:12:43 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/24 18:12:43 INFO data_plane.close: Closing all cached grpc data channels.
19/11/24 18:12:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 18:12:43 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/24 18:12:43 INFO sdk_worker.run: Done consuming work.
19/11/24 18:12:43 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/24 18:12:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/24 18:12:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 18:12:43 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestXZVdpS/job_c7fc6347-c1bb-47e6-b530-611580f87e8a/MANIFEST
19/11/24 18:12:43 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestXZVdpS/job_c7fc6347-c1bb-47e6-b530-611580f87e8a/MANIFEST -> 0 artifacts
19/11/24 18:12:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/24 18:12:44 INFO sdk_worker_main.main: Logging handler created.
19/11/24 18:12:44 INFO sdk_worker_main.start: Status HTTP server running at localhost:37509
19/11/24 18:12:44 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/24 18:12:44 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/24 18:12:44 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574619161.09_1af24fd7-e549-4087-8c79-b07dc7f43611', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/24 18:12:44 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574619161.09', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46255', 'job_port': u'0'}
19/11/24 18:12:44 INFO statecache.__init__: Creating state cache with size 0
19/11/24 18:12:44 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45279.
19/11/24 18:12:44 INFO sdk_worker.__init__: Control channel established.
19/11/24 18:12:44 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/24 18:12:44 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/24 18:12:44 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35491.
19/11/24 18:12:44 INFO sdk_worker.create_state_handler: State channel established.
19/11/24 18:12:44 INFO data_plane.create_data_channel: Creating client data channel for localhost:33641
19/11/24 18:12:44 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/24 18:12:44 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/24 18:12:44 INFO sdk_worker.run: No more requests from control plane
19/11/24 18:12:44 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/24 18:12:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 18:12:44 INFO data_plane.close: Closing all cached grpc data channels.
19/11/24 18:12:44 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/24 18:12:44 INFO sdk_worker.run: Done consuming work.
19/11/24 18:12:44 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/24 18:12:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/24 18:12:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 18:12:44 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestXZVdpS/job_c7fc6347-c1bb-47e6-b530-611580f87e8a/MANIFEST
19/11/24 18:12:44 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestXZVdpS/job_c7fc6347-c1bb-47e6-b530-611580f87e8a/MANIFEST -> 0 artifacts
19/11/24 18:12:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/24 18:12:45 INFO sdk_worker_main.main: Logging handler created.
19/11/24 18:12:45 INFO sdk_worker_main.start: Status HTTP server running at localhost:35971
19/11/24 18:12:45 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/24 18:12:45 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/24 18:12:45 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574619161.09_1af24fd7-e549-4087-8c79-b07dc7f43611', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/24 18:12:45 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574619161.09', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46255', 'job_port': u'0'}
19/11/24 18:12:45 INFO statecache.__init__: Creating state cache with size 0
19/11/24 18:12:45 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40125.
19/11/24 18:12:45 INFO sdk_worker.__init__: Control channel established.
19/11/24 18:12:45 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/24 18:12:45 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/24 18:12:45 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41997.
19/11/24 18:12:45 INFO sdk_worker.create_state_handler: State channel established.
19/11/24 18:12:45 INFO data_plane.create_data_channel: Creating client data channel for localhost:37387
19/11/24 18:12:45 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/24 18:12:45 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/24 18:12:45 INFO sdk_worker.run: No more requests from control plane
19/11/24 18:12:45 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/24 18:12:45 INFO data_plane.close: Closing all cached grpc data channels.
19/11/24 18:12:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 18:12:45 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/24 18:12:45 INFO sdk_worker.run: Done consuming work.
19/11/24 18:12:45 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/24 18:12:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/24 18:12:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 18:12:45 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestXZVdpS/job_c7fc6347-c1bb-47e6-b530-611580f87e8a/MANIFEST
19/11/24 18:12:45 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestXZVdpS/job_c7fc6347-c1bb-47e6-b530-611580f87e8a/MANIFEST -> 0 artifacts
19/11/24 18:12:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/24 18:12:46 INFO sdk_worker_main.main: Logging handler created.
19/11/24 18:12:46 INFO sdk_worker_main.start: Status HTTP server running at localhost:35331
19/11/24 18:12:46 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/24 18:12:46 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/24 18:12:46 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574619161.09_1af24fd7-e549-4087-8c79-b07dc7f43611', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/24 18:12:46 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574619161.09', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46255', 'job_port': u'0'}
19/11/24 18:12:46 INFO statecache.__init__: Creating state cache with size 0
19/11/24 18:12:46 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41813.
19/11/24 18:12:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/24 18:12:46 INFO sdk_worker.__init__: Control channel established.
19/11/24 18:12:46 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/24 18:12:46 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41893.
19/11/24 18:12:46 INFO sdk_worker.create_state_handler: State channel established.
19/11/24 18:12:46 INFO data_plane.create_data_channel: Creating client data channel for localhost:36743
19/11/24 18:12:46 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/24 18:12:46 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/24 18:12:46 INFO sdk_worker.run: No more requests from control plane
19/11/24 18:12:46 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/24 18:12:46 INFO data_plane.close: Closing all cached grpc data channels.
19/11/24 18:12:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 18:12:46 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/24 18:12:46 INFO sdk_worker.run: Done consuming work.
19/11/24 18:12:46 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/24 18:12:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/24 18:12:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 18:12:46 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574619161.09_1af24fd7-e549-4087-8c79-b07dc7f43611 finished.
19/11/24 18:12:46 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/24 18:12:46 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestXZVdpS/job_c7fc6347-c1bb-47e6-b530-611580f87e8a/MANIFEST has 0 artifact locations
19/11/24 18:12:46 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestXZVdpS/job_c7fc6347-c1bb-47e6-b530-611580f87e8a/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139969336645376)>

# Thread: <Thread(Thread-119, started daemon 139969345038080)>

# Thread: <_MainThread(MainThread, started 139970124269312)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139969310418688)>

# Thread: <Thread(Thread-123, started daemon 139969319073536)>

# Thread: <_MainThread(MainThread, started 139970124269312)>

# Thread: <Thread(wait_until_finish_read, started daemon 139969336645376)>

# Thread: <Thread(Thread-119, started daemon 139969345038080)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574619152.07_2d65831b-8630-4358-acdd-273c7a1d6f36 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
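
Across all of these builds the SDF failure has the same Java-side root cause: the bundle was constructed without a checkpoint handler, so the first splittable-DoFn checkpoint request is rejected with the UnsupportedOperationException quoted above. A sketch of that guard pattern, written in Python for consistency with the other examples here (the real ActiveBundle is Beam Java code; names are illustrative):

    class ActiveBundleSketch(object):
        """Illustrative guard; not the actual Beam Java ActiveBundle."""

        def __init__(self, checkpoint_handler=None):
            self._checkpoint_handler = checkpoint_handler

        def checkpoint(self, residual):
            # Watermark-tracking SDF tests need the runner to accept
            # checkpoints; with no handler registered, the call is rejected.
            if self._checkpoint_handler is None:
                raise NotImplementedError(
                    'The ActiveBundle does not have a registered bundle '
                    'checkpoint handler.')
            self._checkpoint_handler(residual)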

----------------------------------------------------------------------
Ran 38 tests in 297.423s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 32s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/xv5b346hajtom

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1616

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1616/display/redirect>

Changes:


------------------------------------------
[...truncated 1.34 MB...]
19/11/24 12:11:10 INFO sdk_worker.create_state_handler: State channel established.
19/11/24 12:11:10 INFO data_plane.create_data_channel: Creating client data channel for localhost:40955
19/11/24 12:11:10 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/24 12:11:10 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/24 12:11:10 INFO sdk_worker.run: No more requests from control plane
19/11/24 12:11:10 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/24 12:11:10 INFO data_plane.close: Closing all cached grpc data channels.
19/11/24 12:11:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 12:11:10 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/24 12:11:10 INFO sdk_worker.run: Done consuming work.
19/11/24 12:11:10 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/24 12:11:10 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/24 12:11:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 12:11:10 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestMQyxeX/job_44e3856b-6fb3-4db2-a992-3fc7f6a2441f/MANIFEST
19/11/24 12:11:10 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestMQyxeX/job_44e3856b-6fb3-4db2-a992-3fc7f6a2441f/MANIFEST -> 0 artifacts
19/11/24 12:11:11 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/24 12:11:11 INFO sdk_worker_main.main: Logging handler created.
19/11/24 12:11:11 INFO sdk_worker_main.start: Status HTTP server running at localhost:38709
19/11/24 12:11:11 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/24 12:11:11 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/24 12:11:11 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574597469.08_551ab857-b6cc-4f06-8a88-63263031982b', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/24 12:11:11 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574597469.08', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50651', 'job_port': u'0'}
19/11/24 12:11:11 INFO statecache.__init__: Creating state cache with size 0
19/11/24 12:11:11 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34379.
19/11/24 12:11:11 INFO sdk_worker.__init__: Control channel established.
19/11/24 12:11:11 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/24 12:11:11 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/24 12:11:11 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37337.
19/11/24 12:11:11 INFO sdk_worker.create_state_handler: State channel established.
19/11/24 12:11:11 INFO data_plane.create_data_channel: Creating client data channel for localhost:34335
19/11/24 12:11:11 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/24 12:11:11 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/24 12:11:11 INFO sdk_worker.run: No more requests from control plane
19/11/24 12:11:11 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/24 12:11:11 INFO data_plane.close: Closing all cached grpc data channels.
19/11/24 12:11:11 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 12:11:11 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/24 12:11:11 INFO sdk_worker.run: Done consuming work.
19/11/24 12:11:11 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/24 12:11:11 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/24 12:11:11 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 12:11:11 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestMQyxeX/job_44e3856b-6fb3-4db2-a992-3fc7f6a2441f/MANIFEST
19/11/24 12:11:11 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestMQyxeX/job_44e3856b-6fb3-4db2-a992-3fc7f6a2441f/MANIFEST -> 0 artifacts
19/11/24 12:11:12 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/24 12:11:12 INFO sdk_worker_main.main: Logging handler created.
19/11/24 12:11:12 INFO sdk_worker_main.start: Status HTTP server running at localhost:39549
19/11/24 12:11:12 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/24 12:11:12 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/24 12:11:12 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574597469.08_551ab857-b6cc-4f06-8a88-63263031982b', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/24 12:11:12 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574597469.08', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50651', 'job_port': u'0'}
19/11/24 12:11:12 INFO statecache.__init__: Creating state cache with size 0
19/11/24 12:11:12 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37753.
19/11/24 12:11:12 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/24 12:11:12 INFO sdk_worker.__init__: Control channel established.
19/11/24 12:11:12 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/24 12:11:12 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33921.
19/11/24 12:11:12 INFO sdk_worker.create_state_handler: State channel established.
19/11/24 12:11:12 INFO data_plane.create_data_channel: Creating client data channel for localhost:42933
19/11/24 12:11:12 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/24 12:11:12 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/24 12:11:12 INFO sdk_worker.run: No more requests from control plane
19/11/24 12:11:12 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/24 12:11:12 INFO data_plane.close: Closing all cached grpc data channels.
19/11/24 12:11:12 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 12:11:12 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/24 12:11:12 INFO sdk_worker.run: Done consuming work.
19/11/24 12:11:12 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/24 12:11:12 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/24 12:11:12 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 12:11:12 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestMQyxeX/job_44e3856b-6fb3-4db2-a992-3fc7f6a2441f/MANIFEST
19/11/24 12:11:12 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestMQyxeX/job_44e3856b-6fb3-4db2-a992-3fc7f6a2441f/MANIFEST -> 0 artifacts
19/11/24 12:11:13 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/24 12:11:13 INFO sdk_worker_main.main: Logging handler created.
19/11/24 12:11:13 INFO sdk_worker_main.start: Status HTTP server running at localhost:44357
19/11/24 12:11:13 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/24 12:11:13 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/24 12:11:13 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574597469.08_551ab857-b6cc-4f06-8a88-63263031982b', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/24 12:11:13 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574597469.08', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50651', 'job_port': u'0'}
19/11/24 12:11:13 INFO statecache.__init__: Creating state cache with size 0
19/11/24 12:11:13 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34461.
19/11/24 12:11:13 INFO sdk_worker.__init__: Control channel established.
19/11/24 12:11:13 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/24 12:11:13 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/24 12:11:13 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34411.
19/11/24 12:11:13 INFO sdk_worker.create_state_handler: State channel established.
19/11/24 12:11:13 INFO data_plane.create_data_channel: Creating client data channel for localhost:34799
19/11/24 12:11:13 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/24 12:11:13 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/24 12:11:13 INFO sdk_worker.run: No more requests from control plane
19/11/24 12:11:13 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/24 12:11:13 INFO data_plane.close: Closing all cached grpc data channels.
19/11/24 12:11:13 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 12:11:13 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/24 12:11:13 INFO sdk_worker.run: Done consuming work.
19/11/24 12:11:13 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/24 12:11:13 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/24 12:11:13 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 12:11:13 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestMQyxeX/job_44e3856b-6fb3-4db2-a992-3fc7f6a2441f/MANIFEST
19/11/24 12:11:13 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestMQyxeX/job_44e3856b-6fb3-4db2-a992-3fc7f6a2441f/MANIFEST -> 0 artifacts
19/11/24 12:11:13 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/24 12:11:13 INFO sdk_worker_main.main: Logging handler created.
19/11/24 12:11:13 INFO sdk_worker_main.start: Status HTTP server running at localhost:41313
19/11/24 12:11:13 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/24 12:11:13 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/24 12:11:13 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574597469.08_551ab857-b6cc-4f06-8a88-63263031982b', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/24 12:11:13 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574597469.08', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50651', 'job_port': u'0'}
19/11/24 12:11:13 INFO statecache.__init__: Creating state cache with size 0
19/11/24 12:11:13 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43763.
19/11/24 12:11:14 INFO sdk_worker.__init__: Control channel established.
19/11/24 12:11:14 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/24 12:11:14 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/24 12:11:14 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35031.
19/11/24 12:11:14 INFO sdk_worker.create_state_handler: State channel established.
19/11/24 12:11:14 INFO data_plane.create_data_channel: Creating client data channel for localhost:44977
19/11/24 12:11:14 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/24 12:11:14 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/24 12:11:14 INFO sdk_worker.run: No more requests from control plane
19/11/24 12:11:14 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/24 12:11:14 INFO data_plane.close: Closing all cached grpc data channels.
19/11/24 12:11:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 12:11:14 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/24 12:11:14 INFO sdk_worker.run: Done consuming work.
19/11/24 12:11:14 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/24 12:11:14 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/24 12:11:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 12:11:14 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574597469.08_551ab857-b6cc-4f06-8a88-63263031982b finished.
19/11/24 12:11:14 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/24 12:11:14 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestMQyxeX/job_44e3856b-6fb3-4db2-a992-3fc7f6a2441f/MANIFEST has 0 artifact locations
19/11/24 12:11:14 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestMQyxeX/job_44e3856b-6fb3-4db2-a992-3fc7f6a2441f/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139766616024832)>

# Thread: <Thread(Thread-120, started daemon 139766599239424)>

# Thread: <_MainThread(MainThread, started 139767605778176)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139766582454016)>

# Thread: <Thread(Thread-124, started daemon 139766590846720)>

# Thread: <_MainThread(MainThread, started 139767605778176)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574597460.03_6c0533db-2e6a-485a-8a8d-667b87e204de failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
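
Unlike the timeouts above, this failure surfaces through the job service: wait_until_finish observes a terminal FAILED state and raises with the job id and the runner's last error message. The client-side pattern, roughly (a sketch, assuming a constructed pipeline `p`):

    result = p.run()                # returns a PipelineResult handle
    try:
        result.wait_until_finish()  # streams job-state responses until terminal
    except RuntimeError as exc:
        # Message carries the job id, terminal state, and the runner's last
        # error, e.g. the UnsupportedOperationException above.
        print('pipeline failed: %s' % exc)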

----------------------------------------------------------------------
Ran 38 tests in 286.873s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 37s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/7mdpvxdw4f5qc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1615

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1615/display/redirect>

Changes:


------------------------------------------
[...truncated 1.34 MB...]
19/11/24 06:11:33 INFO sdk_worker.create_state_handler: State channel established.
19/11/24 06:11:33 INFO data_plane.create_data_channel: Creating client data channel for localhost:38865
19/11/24 06:11:33 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/24 06:11:33 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/24 06:11:33 INFO sdk_worker.run: No more requests from control plane
19/11/24 06:11:33 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/24 06:11:33 INFO data_plane.close: Closing all cached grpc data channels.
19/11/24 06:11:33 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/24 06:11:33 INFO sdk_worker.run: Done consuming work.
19/11/24 06:11:33 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/24 06:11:33 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 06:11:33 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/24 06:11:33 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 06:11:33 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestZzpULq/job_8a66d48a-2b20-4cfe-b72e-51112a973ea0/MANIFEST
19/11/24 06:11:33 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestZzpULq/job_8a66d48a-2b20-4cfe-b72e-51112a973ea0/MANIFEST -> 0 artifacts
19/11/24 06:11:33 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/24 06:11:33 INFO sdk_worker_main.main: Logging handler created.
19/11/24 06:11:33 INFO sdk_worker_main.start: Status HTTP server running at localhost:36859
19/11/24 06:11:33 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/24 06:11:33 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/24 06:11:33 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574575891.47_25c81252-c0a4-44de-afde-871cdceb25cf', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/24 06:11:33 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574575891.47', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:38229', 'job_port': u'0'}
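
For reference, a pipeline can be submitted with the same knobs the suite logs here; a minimal sketch (values are illustrative, mirroring the options above, with a placeholder worker script path):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:38229',   # job server address from this log
        '--environment_type=PROCESS',       # run the SDK harness as a subprocess
        '--environment_config={"command": "/path/to/sdk_worker.sh"}',
        '--experiments=beam_fn_api',
    ])

    with beam.Pipeline(options=options) as p:
        p | beam.Create(['a', 'b']) | beam.Map(lambda x: x.upper())
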
19/11/24 06:11:33 INFO statecache.__init__: Creating state cache with size 0
19/11/24 06:11:33 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35579.
19/11/24 06:11:33 INFO sdk_worker.__init__: Control channel established.
19/11/24 06:11:33 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/24 06:11:33 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/24 06:11:33 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40247.
19/11/24 06:11:33 INFO sdk_worker.create_state_handler: State channel established.
19/11/24 06:11:34 INFO data_plane.create_data_channel: Creating client data channel for localhost:40585
19/11/24 06:11:34 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/24 06:11:34 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/24 06:11:34 INFO sdk_worker.run: No more requests from control plane
19/11/24 06:11:34 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/24 06:11:34 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 06:11:34 INFO data_plane.close: Closing all cached grpc data channels.
19/11/24 06:11:34 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/24 06:11:34 INFO sdk_worker.run: Done consuming work.
19/11/24 06:11:34 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/24 06:11:34 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/24 06:11:34 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 06:11:34 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestZzpULq/job_8a66d48a-2b20-4cfe-b72e-51112a973ea0/MANIFEST
19/11/24 06:11:34 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestZzpULq/job_8a66d48a-2b20-4cfe-b72e-51112a973ea0/MANIFEST -> 0 artifacts
19/11/24 06:11:34 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/24 06:11:34 INFO sdk_worker_main.main: Logging handler created.
19/11/24 06:11:34 INFO sdk_worker_main.start: Status HTTP server running at localhost:38825
19/11/24 06:11:34 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/24 06:11:34 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/24 06:11:34 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574575891.47_25c81252-c0a4-44de-afde-871cdceb25cf', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/24 06:11:34 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574575891.47', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:38229', 'job_port': u'0'}
19/11/24 06:11:34 INFO statecache.__init__: Creating state cache with size 0
19/11/24 06:11:34 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33371.
19/11/24 06:11:34 INFO sdk_worker.__init__: Control channel established.
19/11/24 06:11:34 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/24 06:11:34 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/24 06:11:34 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38409.
19/11/24 06:11:34 INFO sdk_worker.create_state_handler: State channel established.
19/11/24 06:11:34 INFO data_plane.create_data_channel: Creating client data channel for localhost:37943
19/11/24 06:11:34 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/24 06:11:35 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/24 06:11:35 INFO sdk_worker.run: No more requests from control plane
19/11/24 06:11:35 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/24 06:11:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 06:11:35 INFO data_plane.close: Closing all cached grpc data channels.
19/11/24 06:11:35 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/24 06:11:35 INFO sdk_worker.run: Done consuming work.
19/11/24 06:11:35 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/24 06:11:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/24 06:11:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 06:11:35 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestZzpULq/job_8a66d48a-2b20-4cfe-b72e-51112a973ea0/MANIFEST
19/11/24 06:11:35 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestZzpULq/job_8a66d48a-2b20-4cfe-b72e-51112a973ea0/MANIFEST -> 0 artifacts
19/11/24 06:11:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/24 06:11:35 INFO sdk_worker_main.main: Logging handler created.
19/11/24 06:11:35 INFO sdk_worker_main.start: Status HTTP server running at localhost:34621
19/11/24 06:11:35 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/24 06:11:35 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/24 06:11:35 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574575891.47_25c81252-c0a4-44de-afde-871cdceb25cf', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/24 06:11:35 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574575891.47', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:38229', 'job_port': u'0'}
19/11/24 06:11:35 INFO statecache.__init__: Creating state cache with size 0
19/11/24 06:11:35 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45995.
19/11/24 06:11:35 INFO sdk_worker.__init__: Control channel established.
19/11/24 06:11:35 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/24 06:11:35 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/24 06:11:35 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38679.
19/11/24 06:11:35 INFO sdk_worker.create_state_handler: State channel established.
19/11/24 06:11:35 INFO data_plane.create_data_channel: Creating client data channel for localhost:37431
19/11/24 06:11:35 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/24 06:11:35 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/24 06:11:35 INFO sdk_worker.run: No more requests from control plane
19/11/24 06:11:35 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/24 06:11:35 INFO data_plane.close: Closing all cached grpc data channels.
19/11/24 06:11:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 06:11:35 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/24 06:11:35 INFO sdk_worker.run: Done consuming work.
19/11/24 06:11:35 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/24 06:11:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/24 06:11:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 06:11:36 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestZzpULq/job_8a66d48a-2b20-4cfe-b72e-51112a973ea0/MANIFEST
19/11/24 06:11:36 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestZzpULq/job_8a66d48a-2b20-4cfe-b72e-51112a973ea0/MANIFEST -> 0 artifacts
19/11/24 06:11:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/24 06:11:36 INFO sdk_worker_main.main: Logging handler created.
19/11/24 06:11:36 INFO sdk_worker_main.start: Status HTTP server running at localhost:46257
19/11/24 06:11:36 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/24 06:11:36 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/24 06:11:36 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574575891.47_25c81252-c0a4-44de-afde-871cdceb25cf', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/24 06:11:36 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574575891.47', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:38229', 'job_port': u'0'}
19/11/24 06:11:36 INFO statecache.__init__: Creating state cache with size 0
19/11/24 06:11:36 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37571.
19/11/24 06:11:36 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/24 06:11:36 INFO sdk_worker.__init__: Control channel established.
19/11/24 06:11:36 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/24 06:11:36 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33379.
19/11/24 06:11:36 INFO sdk_worker.create_state_handler: State channel established.
19/11/24 06:11:36 INFO data_plane.create_data_channel: Creating client data channel for localhost:37913
19/11/24 06:11:36 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/24 06:11:36 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/24 06:11:36 INFO sdk_worker.run: No more requests from control plane
19/11/24 06:11:36 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/24 06:11:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 06:11:36 INFO data_plane.close: Closing all cached grpc data channels.
19/11/24 06:11:36 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/24 06:11:36 INFO sdk_worker.run: Done consuming work.
19/11/24 06:11:36 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/24 06:11:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/24 06:11:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 06:11:36 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574575891.47_25c81252-c0a4-44de-afde-871cdceb25cf finished.
19/11/24 06:11:36 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/24 06:11:36 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestZzpULq/job_8a66d48a-2b20-4cfe-b72e-51112a973ea0/MANIFEST has 0 artifact locations
19/11/24 06:11:36 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestZzpULq/job_8a66d48a-2b20-4cfe-b72e-51112a973ea0/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
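
test_pardo_state_with_custom_key_coder drives user state through the Fn API, where the runner and the SDK harness must agree on how element keys are encoded. The rough shape of such a stateful DoFn (an illustrative sketch, not the test itself):

    import apache_beam as beam
    from apache_beam.coders import VarIntCoder
    from apache_beam.transforms.userstate import BagStateSpec

    class BufferingFn(beam.DoFn):
        # Per-key bag state; the runner partitions state by the element's key,
        # which is why the key coder matters here.
        BUFFER = BagStateSpec('buffer', VarIntCoder())

        def process(self, kv, buffer=beam.DoFn.StateParam(BUFFER)):
            key, value = kv  # input must be keyed for stateful ParDo
            buffer.add(value)
            yield key, sorted(buffer.read())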

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140422901139200)>

# Thread: <Thread(Thread-120, started daemon 140422892746496)>

# Thread: <_MainThread(MainThread, started 140423682541312)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140422462560000)>

# Thread: <Thread(Thread-126, started daemon 140422470952704)>

# Thread: <_MainThread(MainThread, started 140423682541312)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574575882.25_00ec24c7-6f9c-4964-8ddd-21bfe0b918b8 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
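
test_sdf_with_watermark_tracking exercises a splittable DoFn, whose work the runner may checkpoint mid-bundle; the UnsupportedOperationException says the portable Spark runner had no bundle checkpoint handler registered at this point. The rough shape of such a DoFn (an illustrative sketch; names are not from the test):

    import apache_beam as beam
    from apache_beam.io.restriction_trackers import (
        OffsetRange, OffsetRestrictionTracker)
    from apache_beam.transforms.core import RestrictionProvider

    class StringRestrictionProvider(RestrictionProvider):
        def initial_restriction(self, element):
            return OffsetRange(0, len(element))

        def create_tracker(self, restriction):
            return OffsetRestrictionTracker(restriction)

    class ExpandStringFn(beam.DoFn):
        # Emits one character per claimed offset; the tracker is what lets
        # the runner split or checkpoint the remaining work.
        def process(self, element,
                    tracker=beam.DoFn.RestrictionParam(StringRestrictionProvider())):
            position = tracker.current_restriction().start
            while tracker.try_claim(position):
                yield element[position]
                position += 1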

----------------------------------------------------------------------
Ran 38 tests in 300.071s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 42s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/ws7ohyjzqpisa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1614

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1614/display/redirect?page=changes>

Changes:

[kenn] Use only standard table resolution for ZetaSQL dialect


------------------------------------------
[...truncated 1.34 MB...]
19/11/24 03:58:14 INFO sdk_worker.create_state_handler: State channel established.
19/11/24 03:58:14 INFO data_plane.create_data_channel: Creating client data channel for localhost:46051
19/11/24 03:58:14 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/24 03:58:14 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/24 03:58:14 INFO sdk_worker.run: No more requests from control plane
19/11/24 03:58:14 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/24 03:58:14 INFO data_plane.close: Closing all cached grpc data channels.
19/11/24 03:58:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 03:58:14 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/24 03:58:14 INFO sdk_worker.run: Done consuming work.
19/11/24 03:58:14 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/24 03:58:14 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/24 03:58:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 03:58:14 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestiMByAz/job_a041f1f2-c5df-4b13-ae6d-9b9ed7b36ea5/MANIFEST
19/11/24 03:58:14 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestiMByAz/job_a041f1f2-c5df-4b13-ae6d-9b9ed7b36ea5/MANIFEST -> 0 artifacts
19/11/24 03:58:15 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/24 03:58:15 INFO sdk_worker_main.main: Logging handler created.
19/11/24 03:58:15 INFO sdk_worker_main.start: Status HTTP server running at localhost:46377
19/11/24 03:58:15 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/24 03:58:15 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/24 03:58:15 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574567892.75_3ec6ed89-d9e0-48e9-8d95-f5557c746095', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/24 03:58:15 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574567892.75', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52835', 'job_port': u'0'}
19/11/24 03:58:15 INFO statecache.__init__: Creating state cache with size 0
19/11/24 03:58:15 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44649.
19/11/24 03:58:15 INFO sdk_worker.__init__: Control channel established.
19/11/24 03:58:15 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/24 03:58:15 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/24 03:58:15 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39763.
19/11/24 03:58:15 INFO sdk_worker.create_state_handler: State channel established.
19/11/24 03:58:15 INFO data_plane.create_data_channel: Creating client data channel for localhost:35489
19/11/24 03:58:15 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/24 03:58:15 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/24 03:58:15 INFO sdk_worker.run: No more requests from control plane
19/11/24 03:58:15 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/24 03:58:15 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 03:58:15 INFO data_plane.close: Closing all cached grpc data channels.
19/11/24 03:58:15 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/24 03:58:15 INFO sdk_worker.run: Done consuming work.
19/11/24 03:58:15 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/24 03:58:15 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/24 03:58:15 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 03:58:15 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestiMByAz/job_a041f1f2-c5df-4b13-ae6d-9b9ed7b36ea5/MANIFEST
19/11/24 03:58:15 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestiMByAz/job_a041f1f2-c5df-4b13-ae6d-9b9ed7b36ea5/MANIFEST -> 0 artifacts
19/11/24 03:58:16 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/24 03:58:16 INFO sdk_worker_main.main: Logging handler created.
19/11/24 03:58:16 INFO sdk_worker_main.start: Status HTTP server running at localhost:35985
19/11/24 03:58:16 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/24 03:58:16 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/24 03:58:16 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574567892.75_3ec6ed89-d9e0-48e9-8d95-f5557c746095', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/24 03:58:16 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574567892.75', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52835', 'job_port': u'0'}
19/11/24 03:58:16 INFO statecache.__init__: Creating state cache with size 0
19/11/24 03:58:16 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42509.
19/11/24 03:58:16 INFO sdk_worker.__init__: Control channel established.
19/11/24 03:58:16 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/24 03:58:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/24 03:58:16 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45987.
19/11/24 03:58:16 INFO sdk_worker.create_state_handler: State channel established.
19/11/24 03:58:16 INFO data_plane.create_data_channel: Creating client data channel for localhost:45045
19/11/24 03:58:16 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/24 03:58:16 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/24 03:58:16 INFO sdk_worker.run: No more requests from control plane
19/11/24 03:58:16 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/24 03:58:16 INFO data_plane.close: Closing all cached grpc data channels.
19/11/24 03:58:16 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 03:58:16 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/24 03:58:16 INFO sdk_worker.run: Done consuming work.
19/11/24 03:58:16 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/24 03:58:16 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/24 03:58:16 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 03:58:16 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestiMByAz/job_a041f1f2-c5df-4b13-ae6d-9b9ed7b36ea5/MANIFEST
19/11/24 03:58:16 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestiMByAz/job_a041f1f2-c5df-4b13-ae6d-9b9ed7b36ea5/MANIFEST -> 0 artifacts
19/11/24 03:58:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/24 03:58:17 INFO sdk_worker_main.main: Logging handler created.
19/11/24 03:58:17 INFO sdk_worker_main.start: Status HTTP server running at localhost:43211
19/11/24 03:58:17 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/24 03:58:17 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/24 03:58:17 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574567892.75_3ec6ed89-d9e0-48e9-8d95-f5557c746095', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/24 03:58:17 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574567892.75', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52835', 'job_port': u'0'}
19/11/24 03:58:17 INFO statecache.__init__: Creating state cache with size 0
19/11/24 03:58:17 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39777.
19/11/24 03:58:17 INFO sdk_worker.__init__: Control channel established.
19/11/24 03:58:17 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/24 03:58:17 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/24 03:58:17 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46331.
19/11/24 03:58:17 INFO sdk_worker.create_state_handler: State channel established.
19/11/24 03:58:17 INFO data_plane.create_data_channel: Creating client data channel for localhost:42781
19/11/24 03:58:17 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/24 03:58:17 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/24 03:58:17 INFO sdk_worker.run: No more requests from control plane
19/11/24 03:58:17 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/24 03:58:17 INFO data_plane.close: Closing all cached grpc data channels.
19/11/24 03:58:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 03:58:17 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/24 03:58:17 INFO sdk_worker.run: Done consuming work.
19/11/24 03:58:17 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/24 03:58:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/24 03:58:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 03:58:17 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestiMByAz/job_a041f1f2-c5df-4b13-ae6d-9b9ed7b36ea5/MANIFEST
19/11/24 03:58:17 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestiMByAz/job_a041f1f2-c5df-4b13-ae6d-9b9ed7b36ea5/MANIFEST -> 0 artifacts
19/11/24 03:58:18 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/24 03:58:18 INFO sdk_worker_main.main: Logging handler created.
19/11/24 03:58:18 INFO sdk_worker_main.start: Status HTTP server running at localhost:34797
19/11/24 03:58:18 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/24 03:58:18 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/24 03:58:18 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574567892.75_3ec6ed89-d9e0-48e9-8d95-f5557c746095', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/24 03:58:18 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574567892.75', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52835', 'job_port': u'0'}
19/11/24 03:58:18 INFO statecache.__init__: Creating state cache with size 0
19/11/24 03:58:18 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38209.
19/11/24 03:58:18 INFO sdk_worker.__init__: Control channel established.
19/11/24 03:58:18 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/24 03:58:18 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/24 03:58:18 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41965.
19/11/24 03:58:18 INFO sdk_worker.create_state_handler: State channel established.
19/11/24 03:58:18 INFO data_plane.create_data_channel: Creating client data channel for localhost:45939
19/11/24 03:58:18 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/24 03:58:18 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/24 03:58:18 INFO sdk_worker.run: No more requests from control plane
19/11/24 03:58:18 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/24 03:58:18 INFO data_plane.close: Closing all cached grpc data channels.
19/11/24 03:58:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 03:58:18 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/24 03:58:18 INFO sdk_worker.run: Done consuming work.
19/11/24 03:58:18 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/24 03:58:18 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/24 03:58:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 03:58:18 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574567892.75_3ec6ed89-d9e0-48e9-8d95-f5557c746095 finished.
19/11/24 03:58:18 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
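
The warning above means metrics queried from the PipelineResult come back empty on this runner. For reference, the client-side query pattern (a sketch, assuming a completed `result` and a user counter named 'my_counter'):

    from apache_beam.metrics.metric import MetricsFilter

    # On the portable Spark runner at this point, monitoring infos are not
    # collected, so this query returns no counters.
    counters = result.metrics().query(
        MetricsFilter().with_name('my_counter'))['counters']
    for counter in counters:
        print('%s: %s' % (counter.key, counter.committed))
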
19/11/24 03:58:18 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestiMByAz/job_a041f1f2-c5df-4b13-ae6d-9b9ed7b36ea5/MANIFEST has 0 artifact locations
19/11/24 03:58:18 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestiMByAz/job_a041f1f2-c5df-4b13-ae6d-9b9ed7b36ea5/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140199543699200)>

# Thread: <Thread(Thread-117, started daemon 140199535306496)>

# Thread: <_MainThread(MainThread, started 140200666097408)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140199510128384)>

# Thread: <Thread(Thread-123, started daemon 140199518521088)>

# Thread: <_MainThread(MainThread, started 140200666097408)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
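
test_pardo_timers, which hangs here, exercises per-key user timers. The rough shape of a DoFn with an event-time timer (an illustrative sketch, not the test code):

    import apache_beam as beam
    from apache_beam.transforms.timeutil import TimeDomain
    from apache_beam.transforms.userstate import TimerSpec, on_timer

    class FireAfterElementFn(beam.DoFn):
        # An event-time timer: the runner invokes expiry() once the input
        # watermark passes the timestamp set in process().
        EXPIRY = TimerSpec('expiry', TimeDomain.WATERMARK)

        def process(self, kv,  # input must be keyed for timers, as for state
                    timestamp=beam.DoFn.TimestampParam,
                    timer=beam.DoFn.TimerParam(EXPIRY)):
            timer.set(timestamp + 1)

        @on_timer(EXPIRY)
        def expiry(self):
            yield 'fired'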

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574567882.47_2291ad06-42c5-4d77-9136-b7e2bbf9d468 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 315.090s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 14s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/vpl3dm443t37c

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1613

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1613/display/redirect>

Changes:


------------------------------------------
[...truncated 1.35 MB...]

19/11/24 00:14:35 INFO sdk_worker.run: No more requests from control plane
19/11/24 00:14:35 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/24 00:14:35 INFO data_plane.close: Closing all cached grpc data channels.
19/11/24 00:14:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 00:14:35 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/24 00:14:35 INFO sdk_worker.run: Done consuming work.
19/11/24 00:14:35 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/24 00:14:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/24 00:14:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 00:14:36 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest7zxg15/job_61fe00f9-b337-4eb3-9f08-0d94e675c187/MANIFEST
19/11/24 00:14:36 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest7zxg15/job_61fe00f9-b337-4eb3-9f08-0d94e675c187/MANIFEST -> 0 artifacts
19/11/24 00:14:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/24 00:14:36 INFO sdk_worker_main.main: Logging handler created.
19/11/24 00:14:36 INFO sdk_worker_main.start: Status HTTP server running at localhost:34577
19/11/24 00:14:36 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/24 00:14:36 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/24 00:14:36 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574554472.03_9f171d54-446f-42d7-be59-f9af14e6ef3f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/24 00:14:36 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574554472.03', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54179', 'job_port': u'0'}
19/11/24 00:14:36 INFO statecache.__init__: Creating state cache with size 0
19/11/24 00:14:36 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43177.
19/11/24 00:14:36 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/24 00:14:36 INFO sdk_worker.__init__: Control channel established.
19/11/24 00:14:36 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/24 00:14:36 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43111.
19/11/24 00:14:36 INFO sdk_worker.create_state_handler: State channel established.
19/11/24 00:14:36 INFO data_plane.create_data_channel: Creating client data channel for localhost:35147
19/11/24 00:14:36 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/24 00:14:36 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/24 00:14:37 INFO sdk_worker.run: No more requests from control plane
19/11/24 00:14:37 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/24 00:14:37 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 00:14:37 INFO data_plane.close: Closing all cached grpc data channels.
19/11/24 00:14:37 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/24 00:14:37 INFO sdk_worker.run: Done consuming work.
19/11/24 00:14:37 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/24 00:14:37 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/24 00:14:37 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 00:14:37 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest7zxg15/job_61fe00f9-b337-4eb3-9f08-0d94e675c187/MANIFEST
19/11/24 00:14:37 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest7zxg15/job_61fe00f9-b337-4eb3-9f08-0d94e675c187/MANIFEST -> 0 artifacts
19/11/24 00:14:38 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/24 00:14:38 INFO sdk_worker_main.main: Logging handler created.
19/11/24 00:14:38 INFO sdk_worker_main.start: Status HTTP server running at localhost:37455
19/11/24 00:14:38 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/24 00:14:38 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/24 00:14:38 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574554472.03_9f171d54-446f-42d7-be59-f9af14e6ef3f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/24 00:14:38 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574554472.03', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54179', 'job_port': u'0'}
19/11/24 00:14:38 INFO statecache.__init__: Creating state cache with size 0
19/11/24 00:14:38 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44353.
19/11/24 00:14:38 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/24 00:14:38 INFO sdk_worker.__init__: Control channel established.
19/11/24 00:14:38 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/24 00:14:38 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33107.
19/11/24 00:14:38 INFO sdk_worker.create_state_handler: State channel established.
19/11/24 00:14:38 INFO data_plane.create_data_channel: Creating client data channel for localhost:35999
19/11/24 00:14:38 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/24 00:14:38 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/24 00:14:38 INFO sdk_worker.run: No more requests from control plane
19/11/24 00:14:38 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/24 00:14:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 00:14:38 INFO data_plane.close: Closing all cached grpc data channels.
19/11/24 00:14:38 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/24 00:14:38 INFO sdk_worker.run: Done consuming work.
19/11/24 00:14:38 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/24 00:14:38 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/24 00:14:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 00:14:38 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest7zxg15/job_61fe00f9-b337-4eb3-9f08-0d94e675c187/MANIFEST
19/11/24 00:14:38 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest7zxg15/job_61fe00f9-b337-4eb3-9f08-0d94e675c187/MANIFEST -> 0 artifacts
19/11/24 00:14:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/24 00:14:39 INFO sdk_worker_main.main: Logging handler created.
19/11/24 00:14:39 INFO sdk_worker_main.start: Status HTTP server running at localhost:34001
19/11/24 00:14:39 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/24 00:14:39 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/24 00:14:39 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574554472.03_9f171d54-446f-42d7-be59-f9af14e6ef3f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/24 00:14:39 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574554472.03', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54179', 'job_port': u'0'}
19/11/24 00:14:39 INFO statecache.__init__: Creating state cache with size 0
19/11/24 00:14:39 INFO sdk_worker.__init__: Creating insecure control channel for localhost:32909.
19/11/24 00:14:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/24 00:14:39 INFO sdk_worker.__init__: Control channel established.
19/11/24 00:14:39 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/24 00:14:39 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38971.
19/11/24 00:14:39 INFO sdk_worker.create_state_handler: State channel established.
19/11/24 00:14:39 INFO data_plane.create_data_channel: Creating client data channel for localhost:34647
19/11/24 00:14:39 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/24 00:14:39 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/24 00:14:39 INFO sdk_worker.run: No more requests from control plane
19/11/24 00:14:39 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/24 00:14:39 INFO data_plane.close: Closing all cached grpc data channels.
19/11/24 00:14:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 00:14:39 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/24 00:14:39 INFO sdk_worker.run: Done consuming work.
19/11/24 00:14:39 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/24 00:14:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/24 00:14:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/24 00:14:39 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574554472.03_9f171d54-446f-42d7-be59-f9af14e6ef3f finished.
19/11/24 00:14:39 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/24 00:14:39 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktest7zxg15/job_61fe00f9-b337-4eb3-9f08-0d94e675c187/MANIFEST has 0 artifact locations
19/11/24 00:14:39 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktest7zxg15/job_61fe00f9-b337-4eb3-9f08-0d94e675c187/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
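
The 60-second timeout above is raised by the test harness itself, not by the runner: the traceback bottoms out in a handler at portable_runner_test.py line 75 that raises BaseException. A minimal sketch of that watchdog pattern, assuming a SIGALRM-based timer (the raise and the message text match the traceback; everything else is an assumption):

    import signal
    import threading

    TIMEOUT_SECS = 60

    def install_watchdog():
        def handler(signum, frame):
            msg = 'Timed out after %d seconds.' % TIMEOUT_SECS
            print('=' * 20 + ' ' + msg + ' ' + '=' * 20)
            # Dump live threads, as in the "# Thread: ..." lines further down.
            for t in threading.enumerate():
                print('# Thread: %s' % t)
            # BaseException (not Exception) so the hung gRPC read cannot
            # swallow it in a broad "except Exception" block.
            raise BaseException(msg)
        signal.signal(signal.SIGALRM, handler)
        signal.alarm(TIMEOUT_SECS)

Because CPython delivers signal handlers on the main thread, the exception surfaces inside whatever the main thread was blocked on, which is why these tracebacks end in threading.py's wait() before jumping to the handler.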

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_unfusable_side_inputs (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 254, in test_pardo_unfusable_side_inputs
    equal_to([('a', 'a'), ('a', 'b'), ('b', 'a'), ('b', 'b')]))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574554458.97_71d82af0-9bc1-49b4-aaef-ea7164b3ac72 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
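
Unlike the timeouts above, test_sdf_with_watermark_tracking fails fast: the job reaches the FAILED state on the runner side (the Spark portable runner has no checkpoint handler for splittable-DoFn residuals), and wait_until_finish converts that into a RuntimeError. A hedged reconstruction of that surfacing path; the attribute names come from the traceback, the surrounding control flow is an assumption:

    class _JobHandleSketch(object):
        # Stand-in for the PipelineResult the portable runner returns;
        # _job_id, _state_stream, and _last_error_message appear in the
        # traceback, the rest is illustrative.
        def __init__(self, job_id, state_stream):
            self._job_id = job_id
            self._state_stream = state_stream
            self._state = None

        def _last_error_message(self):
            return 'java.lang.UnsupportedOperationException: ...'  # stub

        def wait_until_finish(self):
            for state_response in self._state_stream:
                self._state = state_response.state
                # The real code compares JobState enum values, not strings.
                if self._state == 'FAILED':
                    raise RuntimeError(
                        'Pipeline %s failed in state %s: %s' % (
                            self._job_id, self._state,
                            self._last_error_message()))
            return self._state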

----------------------------------------------------------------------
Ran 38 tests in 385.522s

FAILED (errors=4, skipped=9)
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140216672581376)>

# Thread: <Thread(Thread-120, started daemon 140216681236224)>

# Thread: <_MainThread(MainThread, started 140217467799296)>
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140216178304768)>

# Thread: <Thread(Thread-126, started daemon 140216186697472)>

# Thread: <_MainThread(MainThread, started 140217467799296)>

# Thread: <Thread(Thread-120, started daemon 140216681236224)>

# Thread: <Thread(wait_until_finish_read, started daemon 140216672581376)>
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140216178304768)>

# Thread: <Thread(Thread-135, started daemon 140216169912064)>

# Thread: <_MainThread(MainThread, started 140217467799296)>

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
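
The non-zero exit value Gradle reports above is the test suite's own failure propagating upward: the sparkValidatesRunner task shells out via 'sh', and the script exits 1 because the Python test run recorded errors. A Python analogy of that propagation (not Gradle's actual implementation; the script name below is illustrative):

    import subprocess
    import sys

    rc = subprocess.call(['sh', 'run_validates_runner.sh'])
    if rc != 0:
        # Mirrors "Process 'command 'sh'' finished with non-zero exit value 1".
        sys.exit("Process 'sh' finished with non-zero exit value %d" % rc)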

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 58s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/22i76ymbcr2ra

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1612

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1612/display/redirect>

Changes:


------------------------------------------
[...truncated 1.34 MB...]
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/23 18:13:14 INFO sdk_worker.run: No more requests from control plane
19/11/23 18:13:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 18:13:14 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/23 18:13:14 INFO data_plane.close: Closing all cached grpc data channels.
19/11/23 18:13:14 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/23 18:13:14 INFO sdk_worker.run: Done consuming work.
19/11/23 18:13:14 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/23 18:13:14 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/23 18:13:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 18:13:14 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestFXGG9W/job_f0580234-c30d-425e-b91f-4a8c05f2a187/MANIFEST
19/11/23 18:13:14 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestFXGG9W/job_f0580234-c30d-425e-b91f-4a8c05f2a187/MANIFEST -> 0 artifacts
19/11/23 18:13:15 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/23 18:13:15 INFO sdk_worker_main.main: Logging handler created.
19/11/23 18:13:15 INFO sdk_worker_main.start: Status HTTP server running at localhost:36061
19/11/23 18:13:15 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/23 18:13:15 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/23 18:13:15 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574532791.91_0325ee05-5d7b-4235-945c-66cace75fad6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/23 18:13:15 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574532791.91', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56211', 'job_port': u'0'}
19/11/23 18:13:15 INFO statecache.__init__: Creating state cache with size 0
19/11/23 18:13:15 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40201.
19/11/23 18:13:15 INFO sdk_worker.__init__: Control channel established.
19/11/23 18:13:15 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/23 18:13:15 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/23 18:13:15 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44659.
19/11/23 18:13:15 INFO sdk_worker.create_state_handler: State channel established.
19/11/23 18:13:15 INFO data_plane.create_data_channel: Creating client data channel for localhost:40381
19/11/23 18:13:15 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/23 18:13:15 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/23 18:13:15 INFO sdk_worker.run: No more requests from control plane
19/11/23 18:13:15 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/23 18:13:15 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 18:13:15 INFO data_plane.close: Closing all cached grpc data channels.
19/11/23 18:13:15 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/23 18:13:15 INFO sdk_worker.run: Done consuming work.
19/11/23 18:13:15 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/23 18:13:15 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/23 18:13:15 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 18:13:16 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestFXGG9W/job_f0580234-c30d-425e-b91f-4a8c05f2a187/MANIFEST
19/11/23 18:13:16 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestFXGG9W/job_f0580234-c30d-425e-b91f-4a8c05f2a187/MANIFEST -> 0 artifacts
19/11/23 18:13:16 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/23 18:13:16 INFO sdk_worker_main.main: Logging handler created.
19/11/23 18:13:16 INFO sdk_worker_main.start: Status HTTP server running at localhost:39515
19/11/23 18:13:16 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/23 18:13:16 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/23 18:13:16 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574532791.91_0325ee05-5d7b-4235-945c-66cace75fad6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/23 18:13:16 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574532791.91', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56211', 'job_port': u'0'}
19/11/23 18:13:16 INFO statecache.__init__: Creating state cache with size 0
19/11/23 18:13:16 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38359.
19/11/23 18:13:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/23 18:13:16 INFO sdk_worker.__init__: Control channel established.
19/11/23 18:13:16 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/23 18:13:16 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43827.
19/11/23 18:13:16 INFO sdk_worker.create_state_handler: State channel established.
19/11/23 18:13:16 INFO data_plane.create_data_channel: Creating client data channel for localhost:38471
19/11/23 18:13:16 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/23 18:13:16 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/23 18:13:16 INFO sdk_worker.run: No more requests from control plane
19/11/23 18:13:16 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/23 18:13:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 18:13:17 INFO data_plane.close: Closing all cached grpc data channels.
19/11/23 18:13:17 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/23 18:13:17 INFO sdk_worker.run: Done consuming work.
19/11/23 18:13:17 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/23 18:13:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/23 18:13:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 18:13:17 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestFXGG9W/job_f0580234-c30d-425e-b91f-4a8c05f2a187/MANIFEST
19/11/23 18:13:17 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestFXGG9W/job_f0580234-c30d-425e-b91f-4a8c05f2a187/MANIFEST -> 0 artifacts
19/11/23 18:13:18 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/23 18:13:18 INFO sdk_worker_main.main: Logging handler created.
19/11/23 18:13:18 INFO sdk_worker_main.start: Status HTTP server running at localhost:35861
19/11/23 18:13:18 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/23 18:13:18 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/23 18:13:18 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574532791.91_0325ee05-5d7b-4235-945c-66cace75fad6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/23 18:13:18 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574532791.91', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56211', 'job_port': u'0'}
19/11/23 18:13:18 INFO statecache.__init__: Creating state cache with size 0
19/11/23 18:13:18 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34469.
19/11/23 18:13:18 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/23 18:13:18 INFO sdk_worker.__init__: Control channel established.
19/11/23 18:13:18 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/23 18:13:18 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34005.
19/11/23 18:13:18 INFO sdk_worker.create_state_handler: State channel established.
19/11/23 18:13:18 INFO data_plane.create_data_channel: Creating client data channel for localhost:35777
19/11/23 18:13:18 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/23 18:13:18 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/23 18:13:18 INFO sdk_worker.run: No more requests from control plane
19/11/23 18:13:18 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/23 18:13:18 INFO data_plane.close: Closing all cached grpc data channels.
19/11/23 18:13:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 18:13:18 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/23 18:13:18 INFO sdk_worker.run: Done consuming work.
19/11/23 18:13:18 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/23 18:13:18 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/23 18:13:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 18:13:18 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestFXGG9W/job_f0580234-c30d-425e-b91f-4a8c05f2a187/MANIFEST
19/11/23 18:13:18 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestFXGG9W/job_f0580234-c30d-425e-b91f-4a8c05f2a187/MANIFEST -> 0 artifacts
19/11/23 18:13:19 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/23 18:13:19 INFO sdk_worker_main.main: Logging handler created.
19/11/23 18:13:19 INFO sdk_worker_main.start: Status HTTP server running at localhost:40955
19/11/23 18:13:19 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/23 18:13:19 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/23 18:13:19 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574532791.91_0325ee05-5d7b-4235-945c-66cace75fad6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/23 18:13:19 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574532791.91', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56211', 'job_port': u'0'}
19/11/23 18:13:19 INFO statecache.__init__: Creating state cache with size 0
19/11/23 18:13:19 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39989.
19/11/23 18:13:19 INFO sdk_worker.__init__: Control channel established.
19/11/23 18:13:19 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/23 18:13:19 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/23 18:13:19 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46459.
19/11/23 18:13:19 INFO sdk_worker.create_state_handler: State channel established.
19/11/23 18:13:19 INFO data_plane.create_data_channel: Creating client data channel for localhost:41469
19/11/23 18:13:19 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/23 18:13:19 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/23 18:13:19 INFO sdk_worker.run: No more requests from control plane
19/11/23 18:13:19 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/23 18:13:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 18:13:19 INFO data_plane.close: Closing all cached grpc data channels.
19/11/23 18:13:19 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/23 18:13:19 INFO sdk_worker.run: Done consuming work.
19/11/23 18:13:19 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/23 18:13:19 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/23 18:13:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 18:13:19 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574532791.91_0325ee05-5d7b-4235-945c-66cace75fad6 finished.
19/11/23 18:13:19 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/23 18:13:19 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestFXGG9W/job_f0580234-c30d-425e-b91f-4a8c05f2a187/MANIFEST has 0 artifact locations
19/11/23 18:13:19 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestFXGG9W/job_f0580234-c30d-425e-b91f-4a8c05f2a187/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140569699407616)>

# Thread: <Thread(Thread-120, started daemon 140569707800320)>

# Thread: <_MainThread(MainThread, started 140570692482816)>
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140569674229504)>

# Thread: <Thread(Thread-126, started daemon 140569682622208)>

# Thread: <_MainThread(MainThread, started 140570692482816)>

# Thread: <Thread(Thread-120, started daemon 140569707800320)>

# Thread: <Thread(wait_until_finish_read, started daemon 140569699407616)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574532780.4_71749cf0-d4b9-43ea-b5cf-ffd4b0eedc0d failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 311.405s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 12s
60 actionable tasks: 48 executed, 12 from cache

Publishing build scan...
https://gradle.com/s/s6l7cm23txqti

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1611

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1611/display/redirect>

Changes:


------------------------------------------
[...truncated 1.34 MB...]
19/11/23 12:12:43 INFO sdk_worker_main.main: Logging handler created.
19/11/23 12:12:43 INFO sdk_worker_main.start: Status HTTP server running at localhost:46467
19/11/23 12:12:43 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/23 12:12:43 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/23 12:12:43 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574511160.64_f5011495-0281-4cc0-829c-4ec2accbbc13', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/23 12:12:43 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574511160.64', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:32799', 'job_port': u'0'}
19/11/23 12:12:43 INFO statecache.__init__: Creating state cache with size 0
19/11/23 12:12:43 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38849.
19/11/23 12:12:43 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/23 12:12:43 INFO sdk_worker.__init__: Control channel established.
19/11/23 12:12:43 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/23 12:12:43 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46019.
19/11/23 12:12:43 INFO sdk_worker.create_state_handler: State channel established.
19/11/23 12:12:43 INFO data_plane.create_data_channel: Creating client data channel for localhost:46781
19/11/23 12:12:43 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/23 12:12:43 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/23 12:12:43 INFO sdk_worker.run: No more requests from control plane
19/11/23 12:12:43 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/23 12:12:43 INFO data_plane.close: Closing all cached grpc data channels.
19/11/23 12:12:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 12:12:43 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/23 12:12:43 INFO sdk_worker.run: Done consuming work.
19/11/23 12:12:43 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/23 12:12:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/23 12:12:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 12:12:43 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestacYdWF/job_f09e3e27-b73c-433c-90d2-2496aaeef795/MANIFEST
19/11/23 12:12:43 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestacYdWF/job_f09e3e27-b73c-433c-90d2-2496aaeef795/MANIFEST -> 0 artifacts
19/11/23 12:12:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/23 12:12:44 INFO sdk_worker_main.main: Logging handler created.
19/11/23 12:12:44 INFO sdk_worker_main.start: Status HTTP server running at localhost:33271
19/11/23 12:12:44 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/23 12:12:44 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/23 12:12:44 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574511160.64_f5011495-0281-4cc0-829c-4ec2accbbc13', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/23 12:12:44 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574511160.64', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:32799', 'job_port': u'0'}
19/11/23 12:12:44 INFO statecache.__init__: Creating state cache with size 0
19/11/23 12:12:44 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39261.
19/11/23 12:12:44 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/23 12:12:44 INFO sdk_worker.__init__: Control channel established.
19/11/23 12:12:44 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/23 12:12:44 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44109.
19/11/23 12:12:44 INFO sdk_worker.create_state_handler: State channel established.
19/11/23 12:12:44 INFO data_plane.create_data_channel: Creating client data channel for localhost:46789
19/11/23 12:12:44 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/23 12:12:44 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/23 12:12:44 INFO sdk_worker.run: No more requests from control plane
19/11/23 12:12:44 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/23 12:12:44 INFO data_plane.close: Closing all cached grpc data channels.
19/11/23 12:12:44 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/23 12:12:44 INFO sdk_worker.run: Done consuming work.
19/11/23 12:12:44 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/23 12:12:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/23 12:12:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 12:12:44 ERROR org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Failed to handle for unknown endpoint
org.apache.beam.vendor.grpc.v1p21p0.io.grpc.StatusRuntimeException: CANCELLED: cancelled before receiving half close
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.Status.asRuntimeException(Status.java:524)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onCancel(ServerCalls.java:273)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.PartialForwardingServerCallListener.onCancel(PartialForwardingServerCallListener.java:40)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.ForwardingServerCallListener.onCancel(ForwardingServerCallListener.java:23)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.ForwardingServerCallListener$SimpleForwardingServerCallListener.onCancel(ForwardingServerCallListener.java:40)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.Contexts$ContextualizedServerCallListener.onCancel(Contexts.java:96)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.closed(ServerCallImpl.java:337)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1Closed.runInContext(ServerImpl.java:793)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
19/11/23 12:12:44 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestacYdWF/job_f09e3e27-b73c-433c-90d2-2496aaeef795/MANIFEST
19/11/23 12:12:44 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestacYdWF/job_f09e3e27-b73c-433c-90d2-2496aaeef795/MANIFEST -> 0 artifacts
19/11/23 12:12:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/23 12:12:45 INFO sdk_worker_main.main: Logging handler created.
19/11/23 12:12:45 INFO sdk_worker_main.start: Status HTTP server running at localhost:34207
19/11/23 12:12:45 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/23 12:12:45 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/23 12:12:45 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574511160.64_f5011495-0281-4cc0-829c-4ec2accbbc13', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/23 12:12:45 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574511160.64', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:32799', 'job_port': u'0'}
19/11/23 12:12:45 INFO statecache.__init__: Creating state cache with size 0
19/11/23 12:12:45 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46001.
19/11/23 12:12:45 INFO sdk_worker.__init__: Control channel established.
19/11/23 12:12:45 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/23 12:12:45 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/23 12:12:45 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:32793.
19/11/23 12:12:45 INFO sdk_worker.create_state_handler: State channel established.
19/11/23 12:12:45 INFO data_plane.create_data_channel: Creating client data channel for localhost:43565
19/11/23 12:12:45 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/23 12:12:45 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/23 12:12:45 INFO sdk_worker.run: No more requests from control plane
19/11/23 12:12:45 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/23 12:12:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 12:12:45 INFO data_plane.close: Closing all cached grpc data channels.
19/11/23 12:12:45 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/23 12:12:45 INFO sdk_worker.run: Done consuming work.
19/11/23 12:12:45 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/23 12:12:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/23 12:12:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 12:12:45 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestacYdWF/job_f09e3e27-b73c-433c-90d2-2496aaeef795/MANIFEST
19/11/23 12:12:45 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestacYdWF/job_f09e3e27-b73c-433c-90d2-2496aaeef795/MANIFEST -> 0 artifacts
19/11/23 12:12:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/23 12:12:46 INFO sdk_worker_main.main: Logging handler created.
19/11/23 12:12:46 INFO sdk_worker_main.start: Status HTTP server running at localhost:33863
19/11/23 12:12:46 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/23 12:12:46 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/23 12:12:46 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574511160.64_f5011495-0281-4cc0-829c-4ec2accbbc13', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/23 12:12:46 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574511160.64', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:32799', 'job_port': u'0'}
19/11/23 12:12:46 INFO statecache.__init__: Creating state cache with size 0
19/11/23 12:12:46 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43973.
19/11/23 12:12:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/23 12:12:46 INFO sdk_worker.__init__: Control channel established.
19/11/23 12:12:46 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/23 12:12:46 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40205.
19/11/23 12:12:46 INFO sdk_worker.create_state_handler: State channel established.
19/11/23 12:12:46 INFO data_plane.create_data_channel: Creating client data channel for localhost:35145
19/11/23 12:12:46 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/23 12:12:46 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/23 12:12:46 INFO sdk_worker.run: No more requests from control plane
19/11/23 12:12:46 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/23 12:12:46 INFO data_plane.close: Closing all cached grpc data channels.
19/11/23 12:12:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 12:12:46 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/23 12:12:46 INFO sdk_worker.run: Done consuming work.
19/11/23 12:12:46 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/23 12:12:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/23 12:12:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 12:12:46 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574511160.64_f5011495-0281-4cc0-829c-4ec2accbbc13 finished.
19/11/23 12:12:46 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/23 12:12:46 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestacYdWF/job_f09e3e27-b73c-433c-90d2-2496aaeef795/MANIFEST has 0 artifact locations
19/11/23 12:12:46 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestacYdWF/job_f09e3e27-b73c-433c-90d2-2496aaeef795/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140698809722624)>
# Thread: <Thread(Thread-119, started daemon 140698801329920)>
# Thread: <_MainThread(MainThread, started 140699591124736)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574511150.5_b05d7cde-17fc-4e70-bcc5-00f1c1e547a3 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140698784544512)>
# Thread: <Thread(Thread-125, started daemon 140698776151808)>
# Thread: <Thread(Thread-119, started daemon 140698801329920)>
# Thread: <_MainThread(MainThread, started 140699591124736)>
# Thread: <Thread(wait_until_finish_read, started daemon 140698809722624)>

----------------------------------------------------------------------
Ran 38 tests in 329.359s

FAILED (errors=3, skipped=9)
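Note: the "==================== Timed out after 60 seconds. ====================" banners and "# Thread: <...>" lines that cut into the tracebacks in these reports (regrouped after each traceback here) come from the test suite's timeout watchdog, the handler at apache_beam/runners/portability/portable_runner_test.py line 75 that raises BaseException, which fires asynchronously while unittest is still writing its output. Below is a minimal sketch of such a SIGALRM-based watchdog; install_timeout and the exact output formatting are illustrative assumptions, not the actual Beam test helper:

from __future__ import print_function

import signal
import threading


def install_timeout(timeout_secs=60):
    # Illustrative sketch only: arm an alarm that, on expiry, reports every
    # live thread and then fails the hung test by raising BaseException,
    # mirroring the banner and "# Thread:" lines seen in this log.
    def handler(signum, frame):
        msg = 'Timed out after %s seconds.' % timeout_secs
        print('=' * 20, msg, '=' * 20)
        for t in threading.enumerate():
            print('# Thread: %s' % t)
        raise BaseException(msg)

    signal.signal(signal.SIGALRM, handler)  # Unix-only; main thread only.
    signal.alarm(timeout_secs)

Because the handler runs on the main thread while the test's reporting is mid-write, its output can land anywhere in the stream, which is what garbled the original tracebacks.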

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 8s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/xhsiwifqrpmeg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1610

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1610/display/redirect>

Changes:


------------------------------------------
[...truncated 1.34 MB...]
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/23 06:13:09 INFO sdk_worker.run: No more requests from control plane
19/11/23 06:13:09 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/23 06:13:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 06:13:09 INFO data_plane.close: Closing all cached grpc data channels.
19/11/23 06:13:09 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/23 06:13:09 INFO sdk_worker.run: Done consuming work.
19/11/23 06:13:09 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/23 06:13:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/23 06:13:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 06:13:09 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest37f3JM/job_61c6dc27-eeda-4144-b628-5408013f7307/MANIFEST
19/11/23 06:13:09 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest37f3JM/job_61c6dc27-eeda-4144-b628-5408013f7307/MANIFEST -> 0 artifacts
19/11/23 06:13:10 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/23 06:13:10 INFO sdk_worker_main.main: Logging handler created.
19/11/23 06:13:10 INFO sdk_worker_main.start: Status HTTP server running at localhost:34351
19/11/23 06:13:10 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/23 06:13:10 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/23 06:13:10 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574489587.64_8ef0b585-fbe6-4c8c-8051-1f0d46a6740d', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/23 06:13:10 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574489587.64', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:32995', 'job_port': u'0'}
19/11/23 06:13:10 INFO statecache.__init__: Creating state cache with size 0
19/11/23 06:13:10 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45797.
19/11/23 06:13:10 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/23 06:13:10 INFO sdk_worker.__init__: Control channel established.
19/11/23 06:13:10 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/23 06:13:10 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39081.
19/11/23 06:13:10 INFO sdk_worker.create_state_handler: State channel established.
19/11/23 06:13:10 INFO data_plane.create_data_channel: Creating client data channel for localhost:44047
19/11/23 06:13:10 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/23 06:13:10 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/23 06:13:10 INFO sdk_worker.run: No more requests from control plane
19/11/23 06:13:10 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/23 06:13:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 06:13:10 INFO data_plane.close: Closing all cached grpc data channels.
19/11/23 06:13:10 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/23 06:13:10 INFO sdk_worker.run: Done consuming work.
19/11/23 06:13:10 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/23 06:13:10 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/23 06:13:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 06:13:10 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest37f3JM/job_61c6dc27-eeda-4144-b628-5408013f7307/MANIFEST
19/11/23 06:13:10 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest37f3JM/job_61c6dc27-eeda-4144-b628-5408013f7307/MANIFEST -> 0 artifacts
19/11/23 06:13:11 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/23 06:13:11 INFO sdk_worker_main.main: Logging handler created.
19/11/23 06:13:11 INFO sdk_worker_main.start: Status HTTP server running at localhost:39523
19/11/23 06:13:11 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/23 06:13:11 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/23 06:13:11 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574489587.64_8ef0b585-fbe6-4c8c-8051-1f0d46a6740d', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/23 06:13:11 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574489587.64', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:32995', 'job_port': u'0'}
19/11/23 06:13:11 INFO statecache.__init__: Creating state cache with size 0
19/11/23 06:13:11 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39425.
19/11/23 06:13:11 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/23 06:13:11 INFO sdk_worker.__init__: Control channel established.
19/11/23 06:13:11 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/23 06:13:11 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39747.
19/11/23 06:13:11 INFO sdk_worker.create_state_handler: State channel established.
19/11/23 06:13:11 INFO data_plane.create_data_channel: Creating client data channel for localhost:40573
19/11/23 06:13:11 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/23 06:13:11 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/23 06:13:11 INFO sdk_worker.run: No more requests from control plane
19/11/23 06:13:11 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/23 06:13:11 INFO data_plane.close: Closing all cached grpc data channels.
19/11/23 06:13:11 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 06:13:11 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/23 06:13:11 INFO sdk_worker.run: Done consuming work.
19/11/23 06:13:11 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/23 06:13:11 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/23 06:13:11 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 06:13:11 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest37f3JM/job_61c6dc27-eeda-4144-b628-5408013f7307/MANIFEST
19/11/23 06:13:11 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest37f3JM/job_61c6dc27-eeda-4144-b628-5408013f7307/MANIFEST -> 0 artifacts
19/11/23 06:13:12 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/23 06:13:12 INFO sdk_worker_main.main: Logging handler created.
19/11/23 06:13:12 INFO sdk_worker_main.start: Status HTTP server running at localhost:33245
19/11/23 06:13:12 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/23 06:13:12 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/23 06:13:12 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574489587.64_8ef0b585-fbe6-4c8c-8051-1f0d46a6740d', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/23 06:13:12 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574489587.64', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:32995', 'job_port': u'0'}
19/11/23 06:13:12 INFO statecache.__init__: Creating state cache with size 0
19/11/23 06:13:12 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33897.
19/11/23 06:13:12 INFO sdk_worker.__init__: Control channel established.
19/11/23 06:13:12 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/23 06:13:12 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/23 06:13:12 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36275.
19/11/23 06:13:12 INFO sdk_worker.create_state_handler: State channel established.
19/11/23 06:13:12 INFO data_plane.create_data_channel: Creating client data channel for localhost:39923
19/11/23 06:13:12 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/23 06:13:12 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/23 06:13:12 INFO sdk_worker.run: No more requests from control plane
19/11/23 06:13:12 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/23 06:13:12 INFO data_plane.close: Closing all cached grpc data channels.
19/11/23 06:13:12 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 06:13:12 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/23 06:13:12 INFO sdk_worker.run: Done consuming work.
19/11/23 06:13:12 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/23 06:13:12 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/23 06:13:12 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 06:13:12 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest37f3JM/job_61c6dc27-eeda-4144-b628-5408013f7307/MANIFEST
19/11/23 06:13:12 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest37f3JM/job_61c6dc27-eeda-4144-b628-5408013f7307/MANIFEST -> 0 artifacts
19/11/23 06:13:13 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/23 06:13:13 INFO sdk_worker_main.main: Logging handler created.
19/11/23 06:13:13 INFO sdk_worker_main.start: Status HTTP server running at localhost:35117
19/11/23 06:13:13 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/23 06:13:13 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/23 06:13:13 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574489587.64_8ef0b585-fbe6-4c8c-8051-1f0d46a6740d', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/23 06:13:13 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574489587.64', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:32995', 'job_port': u'0'}
19/11/23 06:13:13 INFO statecache.__init__: Creating state cache with size 0
19/11/23 06:13:13 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40179.
19/11/23 06:13:13 INFO sdk_worker.__init__: Control channel established.
19/11/23 06:13:13 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/23 06:13:13 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/23 06:13:13 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40285.
19/11/23 06:13:13 INFO sdk_worker.create_state_handler: State channel established.
19/11/23 06:13:13 INFO data_plane.create_data_channel: Creating client data channel for localhost:35763
19/11/23 06:13:13 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/23 06:13:13 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/23 06:13:13 INFO sdk_worker.run: No more requests from control plane
19/11/23 06:13:13 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/23 06:13:13 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 06:13:13 INFO data_plane.close: Closing all cached grpc data channels.
19/11/23 06:13:13 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/23 06:13:13 INFO sdk_worker.run: Done consuming work.
19/11/23 06:13:13 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/23 06:13:13 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/23 06:13:13 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 06:13:13 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574489587.64_8ef0b585-fbe6-4c8c-8051-1f0d46a6740d finished.
19/11/23 06:13:13 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/23 06:13:13 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktest37f3JM/job_61c6dc27-eeda-4144-b628-5408013f7307/MANIFEST has 0 artifact locations
19/11/23 06:13:13 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktest37f3JM/job_61c6dc27-eeda-4144-b628-5408013f7307/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139954369840896)>
# Thread: <Thread(Thread-118, started daemon 139954378233600)>
# Thread: <_MainThread(MainThread, started 139955372893952)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574489577.39_d45b04cd-a36d-4e69-be16-3958bc957883 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139954361448192)>
# Thread: <Thread(Thread-124, started daemon 139954353055488)>
# Thread: <_MainThread(MainThread, started 139955372893952)>
# Thread: <Thread(Thread-118, started daemon 139954378233600)>
# Thread: <Thread(wait_until_finish_read, started daemon 139954369840896)>

----------------------------------------------------------------------
Ran 38 tests in 341.887s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 34s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/zet7jacr57yh2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1609

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1609/display/redirect?page=changes>

Changes:

[tweise] Documenting how to use Beam twitter handle (#10082)


------------------------------------------
[...truncated 1.34 MB...]
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/23 00:52:40 INFO sdk_worker.run: No more requests from control plane
19/11/23 00:52:40 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/23 00:52:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 00:52:40 INFO data_plane.close: Closing all cached grpc data channels.
19/11/23 00:52:40 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/23 00:52:40 INFO sdk_worker.run: Done consuming work.
19/11/23 00:52:40 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/23 00:52:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/23 00:52:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 00:52:40 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestNJmif4/job_8f308e34-20a3-496a-8db8-733963294bbe/MANIFEST
19/11/23 00:52:40 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestNJmif4/job_8f308e34-20a3-496a-8db8-733963294bbe/MANIFEST -> 0 artifacts
19/11/23 00:52:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/23 00:52:41 INFO sdk_worker_main.main: Logging handler created.
19/11/23 00:52:41 INFO sdk_worker_main.start: Status HTTP server running at localhost:45971
19/11/23 00:52:41 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/23 00:52:41 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/23 00:52:41 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574470358.94_76a11315-f68e-40c8-bf4b-5a9ed0d2367a', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/23 00:52:41 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574470358.94', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50671', 'job_port': u'0'}
19/11/23 00:52:41 INFO statecache.__init__: Creating state cache with size 0
19/11/23 00:52:41 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40073.
19/11/23 00:52:41 INFO sdk_worker.__init__: Control channel established.
19/11/23 00:52:41 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/23 00:52:41 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/23 00:52:41 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42027.
19/11/23 00:52:41 INFO sdk_worker.create_state_handler: State channel established.
19/11/23 00:52:41 INFO data_plane.create_data_channel: Creating client data channel for localhost:43793
19/11/23 00:52:41 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/23 00:52:41 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/23 00:52:41 INFO sdk_worker.run: No more requests from control plane
19/11/23 00:52:41 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/23 00:52:41 INFO data_plane.close: Closing all cached grpc data channels.
19/11/23 00:52:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 00:52:41 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/23 00:52:41 INFO sdk_worker.run: Done consuming work.
19/11/23 00:52:41 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/23 00:52:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/23 00:52:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 00:52:41 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestNJmif4/job_8f308e34-20a3-496a-8db8-733963294bbe/MANIFEST
19/11/23 00:52:41 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestNJmif4/job_8f308e34-20a3-496a-8db8-733963294bbe/MANIFEST -> 0 artifacts
19/11/23 00:52:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/23 00:52:42 INFO sdk_worker_main.main: Logging handler created.
19/11/23 00:52:42 INFO sdk_worker_main.start: Status HTTP server running at localhost:44263
19/11/23 00:52:42 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/23 00:52:42 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/23 00:52:42 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574470358.94_76a11315-f68e-40c8-bf4b-5a9ed0d2367a', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/23 00:52:42 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574470358.94', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50671', 'job_port': u'0'}
19/11/23 00:52:42 INFO statecache.__init__: Creating state cache with size 0
19/11/23 00:52:42 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41845.
19/11/23 00:52:42 INFO sdk_worker.__init__: Control channel established.
19/11/23 00:52:42 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/23 00:52:42 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/23 00:52:42 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:32991.
19/11/23 00:52:42 INFO sdk_worker.create_state_handler: State channel established.
19/11/23 00:52:42 INFO data_plane.create_data_channel: Creating client data channel for localhost:36283
19/11/23 00:52:42 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/23 00:52:42 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/23 00:52:42 INFO sdk_worker.run: No more requests from control plane
19/11/23 00:52:42 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/23 00:52:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 00:52:42 INFO data_plane.close: Closing all cached grpc data channels.
19/11/23 00:52:42 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/23 00:52:42 INFO sdk_worker.run: Done consuming work.
19/11/23 00:52:42 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/23 00:52:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/23 00:52:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 00:52:42 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestNJmif4/job_8f308e34-20a3-496a-8db8-733963294bbe/MANIFEST
19/11/23 00:52:42 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestNJmif4/job_8f308e34-20a3-496a-8db8-733963294bbe/MANIFEST -> 0 artifacts
19/11/23 00:52:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/23 00:52:43 INFO sdk_worker_main.main: Logging handler created.
19/11/23 00:52:43 INFO sdk_worker_main.start: Status HTTP server running at localhost:34405
19/11/23 00:52:43 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/23 00:52:43 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/23 00:52:43 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574470358.94_76a11315-f68e-40c8-bf4b-5a9ed0d2367a', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/23 00:52:43 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574470358.94', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50671', 'job_port': u'0'}
19/11/23 00:52:43 INFO statecache.__init__: Creating state cache with size 0
19/11/23 00:52:43 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43161.
19/11/23 00:52:43 INFO sdk_worker.__init__: Control channel established.
19/11/23 00:52:43 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/23 00:52:43 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/23 00:52:43 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42871.
19/11/23 00:52:43 INFO sdk_worker.create_state_handler: State channel established.
19/11/23 00:52:43 INFO data_plane.create_data_channel: Creating client data channel for localhost:35801
19/11/23 00:52:43 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/23 00:52:43 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/23 00:52:43 INFO sdk_worker.run: No more requests from control plane
19/11/23 00:52:43 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/23 00:52:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 00:52:43 INFO data_plane.close: Closing all cached grpc data channels.
19/11/23 00:52:43 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/23 00:52:43 INFO sdk_worker.run: Done consuming work.
19/11/23 00:52:43 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/23 00:52:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/23 00:52:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 00:52:43 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestNJmif4/job_8f308e34-20a3-496a-8db8-733963294bbe/MANIFEST
19/11/23 00:52:43 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestNJmif4/job_8f308e34-20a3-496a-8db8-733963294bbe/MANIFEST -> 0 artifacts
19/11/23 00:52:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/23 00:52:44 INFO sdk_worker_main.main: Logging handler created.
19/11/23 00:52:44 INFO sdk_worker_main.start: Status HTTP server running at localhost:35985
19/11/23 00:52:44 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/23 00:52:44 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/23 00:52:44 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574470358.94_76a11315-f68e-40c8-bf4b-5a9ed0d2367a', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/23 00:52:44 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574470358.94', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50671', 'job_port': u'0'}
19/11/23 00:52:44 INFO statecache.__init__: Creating state cache with size 0
19/11/23 00:52:44 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33813.
19/11/23 00:52:44 INFO sdk_worker.__init__: Control channel established.
19/11/23 00:52:44 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/23 00:52:44 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/23 00:52:44 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41703.
19/11/23 00:52:44 INFO sdk_worker.create_state_handler: State channel established.
19/11/23 00:52:44 INFO data_plane.create_data_channel: Creating client data channel for localhost:42843
19/11/23 00:52:44 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/23 00:52:44 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/23 00:52:44 INFO sdk_worker.run: No more requests from control plane
19/11/23 00:52:44 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/23 00:52:44 INFO data_plane.close: Closing all cached grpc data channels.
19/11/23 00:52:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 00:52:44 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/23 00:52:44 INFO sdk_worker.run: Done consuming work.
19/11/23 00:52:44 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/23 00:52:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/23 00:52:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 00:52:44 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574470358.94_76a11315-f68e-40c8-bf4b-5a9ed0d2367a finished.
19/11/23 00:52:44 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/23 00:52:44 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestNJmif4/job_8f308e34-20a3-496a-8db8-733963294bbe/MANIFEST has 0 artifact locations
19/11/23 00:52:44 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestNJmif4/job_8f308e34-20a3-496a-8db8-733963294bbe/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140439360050944)>
# Thread: <Thread(Thread-119, started daemon 140439343265536)>
# Thread: <_MainThread(MainThread, started 140440139282176)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140439256561408)>
# Thread: <Thread(Thread-125, started daemon 140439333299968)>
# Thread: <Thread(Thread-119, started daemon 140439343265536)>
# Thread: <Thread(wait_until_finish_read, started daemon 140439360050944)>
# Thread: <_MainThread(MainThread, started 140440139282176)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574470349.89_46268deb-7747-4a4b-9d10-7d87c904b96d failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 305.312s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 30s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/as3hx2fw6pxh4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1608

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1608/display/redirect>

Changes:


------------------------------------------
[...truncated 1.34 MB...]
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/23 00:17:33 INFO sdk_worker.run: No more requests from control plane
19/11/23 00:17:33 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/23 00:17:33 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 00:17:33 INFO data_plane.close: Closing all cached grpc data channels.
19/11/23 00:17:33 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/23 00:17:33 INFO sdk_worker.run: Done consuming work.
19/11/23 00:17:33 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/23 00:17:33 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/23 00:17:33 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 00:17:33 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestuQa6Ab/job_bb0cc477-cc3f-440b-b3e9-ed79a2726906/MANIFEST
19/11/23 00:17:33 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestuQa6Ab/job_bb0cc477-cc3f-440b-b3e9-ed79a2726906/MANIFEST -> 0 artifacts
19/11/23 00:17:34 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/23 00:17:34 INFO sdk_worker_main.main: Logging handler created.
19/11/23 00:17:34 INFO sdk_worker_main.start: Status HTTP server running at localhost:39923
19/11/23 00:17:34 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/23 00:17:34 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/23 00:17:34 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574468251.9_605d0e07-26c8-4f05-b870-0b4b3042fae8', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/23 00:17:34 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574468251.9', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48889', 'job_port': u'0'}
19/11/23 00:17:34 INFO statecache.__init__: Creating state cache with size 0
19/11/23 00:17:34 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38377.
19/11/23 00:17:34 INFO sdk_worker.__init__: Control channel established.
19/11/23 00:17:34 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/23 00:17:34 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/23 00:17:34 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43659.
19/11/23 00:17:34 INFO sdk_worker.create_state_handler: State channel established.
19/11/23 00:17:34 INFO data_plane.create_data_channel: Creating client data channel for localhost:33999
19/11/23 00:17:34 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/23 00:17:34 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/23 00:17:34 INFO sdk_worker.run: No more requests from control plane
19/11/23 00:17:34 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/23 00:17:34 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 00:17:34 INFO data_plane.close: Closing all cached grpc data channels.
19/11/23 00:17:34 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/23 00:17:34 INFO sdk_worker.run: Done consuming work.
19/11/23 00:17:34 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/23 00:17:34 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/23 00:17:34 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 00:17:34 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestuQa6Ab/job_bb0cc477-cc3f-440b-b3e9-ed79a2726906/MANIFEST
19/11/23 00:17:34 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestuQa6Ab/job_bb0cc477-cc3f-440b-b3e9-ed79a2726906/MANIFEST -> 0 artifacts
19/11/23 00:17:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/23 00:17:35 INFO sdk_worker_main.main: Logging handler created.
19/11/23 00:17:35 INFO sdk_worker_main.start: Status HTTP server running at localhost:35593
19/11/23 00:17:35 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/23 00:17:35 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/23 00:17:35 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574468251.9_605d0e07-26c8-4f05-b870-0b4b3042fae8', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/23 00:17:35 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574468251.9', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48889', 'job_port': u'0'}
19/11/23 00:17:35 INFO statecache.__init__: Creating state cache with size 0
19/11/23 00:17:35 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46007.
19/11/23 00:17:35 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/23 00:17:35 INFO sdk_worker.__init__: Control channel established.
19/11/23 00:17:35 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/23 00:17:35 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44045.
19/11/23 00:17:35 INFO sdk_worker.create_state_handler: State channel established.
19/11/23 00:17:35 INFO data_plane.create_data_channel: Creating client data channel for localhost:39957
19/11/23 00:17:35 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/23 00:17:35 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/23 00:17:35 INFO sdk_worker.run: No more requests from control plane
19/11/23 00:17:35 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/23 00:17:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 00:17:35 INFO data_plane.close: Closing all cached grpc data channels.
19/11/23 00:17:35 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/23 00:17:35 INFO sdk_worker.run: Done consuming work.
19/11/23 00:17:35 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/23 00:17:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/23 00:17:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 00:17:35 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestuQa6Ab/job_bb0cc477-cc3f-440b-b3e9-ed79a2726906/MANIFEST
19/11/23 00:17:35 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestuQa6Ab/job_bb0cc477-cc3f-440b-b3e9-ed79a2726906/MANIFEST -> 0 artifacts
19/11/23 00:17:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/23 00:17:36 INFO sdk_worker_main.main: Logging handler created.
19/11/23 00:17:36 INFO sdk_worker_main.start: Status HTTP server running at localhost:40301
19/11/23 00:17:36 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/23 00:17:36 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/23 00:17:36 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574468251.9_605d0e07-26c8-4f05-b870-0b4b3042fae8', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/23 00:17:36 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574468251.9', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48889', 'job_port': u'0'}
19/11/23 00:17:36 INFO statecache.__init__: Creating state cache with size 0
19/11/23 00:17:36 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44809.
19/11/23 00:17:36 INFO sdk_worker.__init__: Control channel established.
19/11/23 00:17:36 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/23 00:17:36 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/23 00:17:36 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36643.
19/11/23 00:17:36 INFO sdk_worker.create_state_handler: State channel established.
19/11/23 00:17:36 INFO data_plane.create_data_channel: Creating client data channel for localhost:44855
19/11/23 00:17:36 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/23 00:17:36 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/23 00:17:36 INFO sdk_worker.run: No more requests from control plane
19/11/23 00:17:36 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/23 00:17:36 INFO data_plane.close: Closing all cached grpc data channels.
19/11/23 00:17:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 00:17:36 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/23 00:17:36 INFO sdk_worker.run: Done consuming work.
19/11/23 00:17:36 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/23 00:17:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/23 00:17:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 00:17:36 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestuQa6Ab/job_bb0cc477-cc3f-440b-b3e9-ed79a2726906/MANIFEST
19/11/23 00:17:36 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestuQa6Ab/job_bb0cc477-cc3f-440b-b3e9-ed79a2726906/MANIFEST -> 0 artifacts
19/11/23 00:17:37 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/23 00:17:37 INFO sdk_worker_main.main: Logging handler created.
19/11/23 00:17:37 INFO sdk_worker_main.start: Status HTTP server running at localhost:42115
19/11/23 00:17:37 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/23 00:17:37 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/23 00:17:37 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574468251.9_605d0e07-26c8-4f05-b870-0b4b3042fae8', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/23 00:17:37 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574468251.9', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48889', 'job_port': u'0'}
19/11/23 00:17:37 INFO statecache.__init__: Creating state cache with size 0
19/11/23 00:17:37 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34993.
19/11/23 00:17:37 INFO sdk_worker.__init__: Control channel established.
19/11/23 00:17:37 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/23 00:17:37 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/23 00:17:37 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39289.
19/11/23 00:17:37 INFO sdk_worker.create_state_handler: State channel established.
19/11/23 00:17:37 INFO data_plane.create_data_channel: Creating client data channel for localhost:40007
19/11/23 00:17:37 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/23 00:17:37 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/23 00:17:37 INFO sdk_worker.run: No more requests from control plane
19/11/23 00:17:37 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/23 00:17:37 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 00:17:37 INFO data_plane.close: Closing all cached grpc data channels.
19/11/23 00:17:37 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/23 00:17:37 INFO sdk_worker.run: Done consuming work.
19/11/23 00:17:37 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/23 00:17:37 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/23 00:17:37 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/23 00:17:37 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574468251.9_605d0e07-26c8-4f05-b870-0b4b3042fae8 finished.
19/11/23 00:17:37 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/23 00:17:37 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestuQa6Ab/job_bb0cc477-cc3f-440b-b3e9-ed79a2726906/MANIFEST has 0 artifact locations
19/11/23 00:17:37 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestuQa6Ab/job_bb0cc477-cc3f-440b-b3e9-ed79a2726906/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140247983400704)>
# Thread: <Thread(Thread-120, started daemon 140247703836416)>
# Thread: <_MainThread(MainThread, started 140248498792192)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140247684691712)>
# Thread: <Thread(Thread-126, started daemon 140247693084416)>
# Thread: <Thread(Thread-120, started daemon 140247703836416)>
# Thread: <Thread(wait_until_finish_read, started daemon 140247983400704)>
# Thread: <_MainThread(MainThread, started 140248498792192)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
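
The "==================== Timed out after 60 seconds. ====================" banners
and the "# Thread: ..." lines scattered through the tracebacks above are
printed by the test harness's timeout watchdog (the handler at
portable_runner_test.py line 75) and can interleave with an in-progress
traceback on the console. A minimal sketch of such a SIGALRM-based watchdog,
with assumed mechanics rather than the exact Beam test code:

    import signal
    import threading

    TIMEOUT_SECS = 60  # matches the timeout reported in the log

    def handler(signum, frame):
        msg = 'Timed out after %d seconds.' % TIMEOUT_SECS
        print('==================== %s ====================' % msg)
        # Dump all live threads; these become the "# Thread: ..." lines.
        for thread in threading.enumerate():
            print('# Thread: %r' % thread)
        # BaseException rather than Exception so ordinary exception
        # handlers cannot swallow the timeout.
        raise BaseException(msg)

    signal.signal(signal.SIGALRM, handler)
    signal.alarm(TIMEOUT_SECS)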

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574468242.07_7fb0b1a1-9c3f-4940-bc95-0dd90418de19 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 321.517s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 1s
60 actionable tasks: 48 executed, 12 from cache

Publishing build scan...
https://gradle.com/s/ucs7qxsjxuwl2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1607

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1607/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-8805] Remove obsolete worker_threads experiment in tests

[sunjincheng121] [BEAM-8619] Move reusable information to BundleProcessor.

[sunjincheng121] [BEAM-8619] Extract tearDown functions into BundleProcessor.

[sunjincheng121] [BEAM-8619] Reuse the BundleProcessor between bundles for the same

[sunjincheng121] [BEAM-8619] Teardown the DoFns when upon control service termination for


------------------------------------------
[...truncated 1.33 MB...]
19/11/22 22:21:52 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 22:21:52 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowed_pardo_state_timers_1574461306.78_c7db8386-241b-4b96-abef-6f35b5bd8025 finished.
19/11/22 22:21:52 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/22 22:21:52 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestddQ_E2/job_ecb4f3e9-197d-44ba-a9df-8fd10fdfd8f0/MANIFEST has 0 artifact locations
19/11/22 22:21:52 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestddQ_E2/job_ecb4f3e9-197d-44ba-a9df-8fd10fdfd8f0/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function lift_combiners at 0x7fb21e10b230> ====================
19/11/22 22:21:53 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job test_windowing_1574461312.11_d2ebcf47-ee0d-4d40-8685-73df1ab07fe7
19/11/22 22:21:53 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job invocation test_windowing_1574461312.11_d2ebcf47-ee0d-4d40-8685-73df1ab07fe7
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
19/11/22 22:21:53 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
19/11/22 22:21:53 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 1 files. (Enable logging at DEBUG level to see which files will be staged.)
19/11/22 22:21:53 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1574461312.11_d2ebcf47-ee0d-4d40-8685-73df1ab07fe7 on Spark master local
19/11/22 22:21:53 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/11/22 22:21:53 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574461312.11_d2ebcf47-ee0d-4d40-8685-73df1ab07fe7: Pipeline translated successfully. Computing outputs
19/11/22 22:21:53 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestddQ_E2/job_6a9ea630-cd94-46db-a981-a608934eb3a6/MANIFEST
19/11/22 22:21:53 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestddQ_E2/job_6a9ea630-cd94-46db-a981-a608934eb3a6/MANIFEST has 0 artifact locations
19/11/22 22:21:53 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestddQ_E2/job_6a9ea630-cd94-46db-a981-a608934eb3a6/MANIFEST -> 0 artifacts
19/11/22 22:21:53 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/22 22:21:53 INFO sdk_worker_main.main: Logging handler created.
19/11/22 22:21:53 INFO sdk_worker_main.start: Status HTTP server running at localhost:43491
19/11/22 22:21:53 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/22 22:21:53 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/22 22:21:53 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574461312.11_d2ebcf47-ee0d-4d40-8685-73df1ab07fe7', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/22 22:21:53 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574461312.11', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54689', 'job_port': u'0'}
19/11/22 22:21:53 INFO statecache.__init__: Creating state cache with size 0
19/11/22 22:21:53 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40077.
19/11/22 22:21:53 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/11/22 22:21:53 INFO sdk_worker.__init__: Control channel established.
19/11/22 22:21:53 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/22 22:21:53 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34885.
19/11/22 22:21:53 INFO sdk_worker.create_state_handler: State channel established.
19/11/22 22:21:53 INFO data_plane.create_data_channel: Creating client data channel for localhost:42263
19/11/22 22:21:53 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/22 22:21:53 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/22 22:21:53 INFO sdk_worker.run: No more requests from control plane
19/11/22 22:21:53 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/22 22:21:53 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 22:21:53 INFO data_plane.close: Closing all cached grpc data channels.
19/11/22 22:21:53 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/22 22:21:53 INFO sdk_worker.run: Done consuming work.
19/11/22 22:21:53 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/22 22:21:53 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/22 22:21:53 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 22:21:53 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestddQ_E2/job_6a9ea630-cd94-46db-a981-a608934eb3a6/MANIFEST
19/11/22 22:21:53 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestddQ_E2/job_6a9ea630-cd94-46db-a981-a608934eb3a6/MANIFEST -> 0 artifacts
19/11/22 22:21:54 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/22 22:21:54 INFO sdk_worker_main.main: Logging handler created.
19/11/22 22:21:54 INFO sdk_worker_main.start: Status HTTP server running at localhost:34193
19/11/22 22:21:54 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/22 22:21:54 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/22 22:21:54 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574461312.11_d2ebcf47-ee0d-4d40-8685-73df1ab07fe7', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/22 22:21:54 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574461312.11', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54689', 'job_port': u'0'}
19/11/22 22:21:54 INFO statecache.__init__: Creating state cache with size 0
19/11/22 22:21:54 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46713.
19/11/22 22:21:54 INFO sdk_worker.__init__: Control channel established.
19/11/22 22:21:54 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/22 22:21:54 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/22 22:21:54 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36591.
19/11/22 22:21:54 INFO sdk_worker.create_state_handler: State channel established.
19/11/22 22:21:54 INFO data_plane.create_data_channel: Creating client data channel for localhost:41991
19/11/22 22:21:54 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/22 22:21:54 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/22 22:21:54 INFO sdk_worker.run: No more requests from control plane
19/11/22 22:21:54 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/22 22:21:54 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 22:21:54 INFO data_plane.close: Closing all cached grpc data channels.
19/11/22 22:21:54 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/22 22:21:54 INFO sdk_worker.run: Done consuming work.
19/11/22 22:21:54 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/22 22:21:54 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/22 22:21:54 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 22:21:54 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestddQ_E2/job_6a9ea630-cd94-46db-a981-a608934eb3a6/MANIFEST
19/11/22 22:21:54 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestddQ_E2/job_6a9ea630-cd94-46db-a981-a608934eb3a6/MANIFEST -> 0 artifacts
19/11/22 22:21:55 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/22 22:21:55 INFO sdk_worker_main.main: Logging handler created.
19/11/22 22:21:55 INFO sdk_worker_main.start: Status HTTP server running at localhost:43447
19/11/22 22:21:55 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/22 22:21:55 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/22 22:21:55 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574461312.11_d2ebcf47-ee0d-4d40-8685-73df1ab07fe7', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/22 22:21:55 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574461312.11', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54689', 'job_port': u'0'}
19/11/22 22:21:55 INFO statecache.__init__: Creating state cache with size 0
19/11/22 22:21:55 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39487.
19/11/22 22:21:55 INFO sdk_worker.__init__: Control channel established.
19/11/22 22:21:55 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/22 22:21:55 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/22 22:21:55 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38753.
19/11/22 22:21:55 INFO sdk_worker.create_state_handler: State channel established.
19/11/22 22:21:55 INFO data_plane.create_data_channel: Creating client data channel for localhost:41063
19/11/22 22:21:55 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/22 22:21:55 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/22 22:21:55 INFO sdk_worker.run: No more requests from control plane
19/11/22 22:21:55 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/22 22:21:55 INFO data_plane.close: Closing all cached grpc data channels.
19/11/22 22:21:55 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 22:21:55 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/22 22:21:55 INFO sdk_worker.run: Done consuming work.
19/11/22 22:21:55 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/22 22:21:55 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/22 22:21:55 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 22:21:55 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestddQ_E2/job_6a9ea630-cd94-46db-a981-a608934eb3a6/MANIFEST
19/11/22 22:21:55 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestddQ_E2/job_6a9ea630-cd94-46db-a981-a608934eb3a6/MANIFEST -> 0 artifacts
19/11/22 22:21:56 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/22 22:21:56 INFO sdk_worker_main.main: Logging handler created.
19/11/22 22:21:56 INFO sdk_worker_main.start: Status HTTP server running at localhost:34351
19/11/22 22:21:56 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/22 22:21:56 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/22 22:21:56 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574461312.11_d2ebcf47-ee0d-4d40-8685-73df1ab07fe7', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/22 22:21:56 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574461312.11', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54689', 'job_port': u'0'}
19/11/22 22:21:56 INFO statecache.__init__: Creating state cache with size 0
19/11/22 22:21:56 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45669.
19/11/22 22:21:56 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/22 22:21:56 INFO sdk_worker.__init__: Control channel established.
19/11/22 22:21:56 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/22 22:21:56 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36727.
19/11/22 22:21:56 INFO sdk_worker.create_state_handler: State channel established.
19/11/22 22:21:56 INFO data_plane.create_data_channel: Creating client data channel for localhost:37679
19/11/22 22:21:56 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/22 22:21:56 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/22 22:21:56 INFO sdk_worker.run: No more requests from control plane
19/11/22 22:21:56 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/22 22:21:56 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 22:21:56 INFO data_plane.close: Closing all cached grpc data channels.
19/11/22 22:21:56 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/22 22:21:56 INFO sdk_worker.run: Done consuming work.
19/11/22 22:21:56 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/22 22:21:56 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/22 22:21:56 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 22:21:56 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestddQ_E2/job_6a9ea630-cd94-46db-a981-a608934eb3a6/MANIFEST
19/11/22 22:21:56 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestddQ_E2/job_6a9ea630-cd94-46db-a981-a608934eb3a6/MANIFEST -> 0 artifacts
19/11/22 22:21:57 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/22 22:21:57 INFO sdk_worker_main.main: Logging handler created.
19/11/22 22:21:57 INFO sdk_worker_main.start: Status HTTP server running at localhost:42201
19/11/22 22:21:57 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/22 22:21:57 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/22 22:21:57 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574461312.11_d2ebcf47-ee0d-4d40-8685-73df1ab07fe7', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/22 22:21:57 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574461312.11', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54689', 'job_port': u'0'}
19/11/22 22:21:57 INFO statecache.__init__: Creating state cache with size 0
19/11/22 22:21:57 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41069.
19/11/22 22:21:57 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/22 22:21:57 INFO sdk_worker.__init__: Control channel established.
19/11/22 22:21:57 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/22 22:21:57 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44881.
19/11/22 22:21:57 INFO sdk_worker.create_state_handler: State channel established.
19/11/22 22:21:57 INFO data_plane.create_data_channel: Creating client data channel for localhost:42787
19/11/22 22:21:57 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/22 22:21:57 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/22 22:21:57 INFO sdk_worker.run: No more requests from control plane
19/11/22 22:21:57 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/22 22:21:57 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 22:21:57 INFO data_plane.close: Closing all cached grpc data channels.
19/11/22 22:21:57 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/22 22:21:57 INFO sdk_worker.run: Done consuming work.
19/11/22 22:21:57 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/22 22:21:57 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/22 22:21:57 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 22:21:57 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574461312.11_d2ebcf47-ee0d-4d40-8685-73df1ab07fe7 finished.
19/11/22 22:21:57 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/22 22:21:57 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestddQ_E2/job_6a9ea630-cd94-46db-a981-a608934eb3a6/MANIFEST has 0 artifact locations
19/11/22 22:21:57 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestddQ_E2/job_6a9ea630-cd94-46db-a981-a608934eb3a6/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140402512164608)>
# Thread: <Thread(Thread-117, started daemon 140402520557312)>
# Thread: <_MainThread(MainThread, started 140403299788544)>

Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574461303.39_40fc0972-f7f8-413a-b2f2-c45269118eb6 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
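
Note that this is the same java.lang.UnsupportedOperationException ("The
ActiveBundle does not have a registered bundle checkpoint handler") reported by
test_sdf_with_watermark_tracking in the newer builds above, so the failure is
consistent across runs rather than a flake: the splittable-DoFn self-checkpoint
that the test triggers evidently has no registered handler in the Spark
portable runner, and the pipeline ends in state FAILED every time.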

----------------------------------------------------------------------
Ran 38 tests in 283.689s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 32s
60 actionable tasks: 59 executed, 1 from cache

Publishing build scan...
https://gradle.com/s/h4jfsne4magzq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1606

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1606/display/redirect?page=changes>

Changes:

[robertwb] [BEAM-8802] Don't clear watermark hold when adding elements.


------------------------------------------
[...truncated 1.34 MB...]
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/22 20:32:47 INFO sdk_worker.run: No more requests from control plane
19/11/22 20:32:47 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/22 20:32:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 20:32:47 INFO data_plane.close: Closing all cached grpc data channels.
19/11/22 20:32:47 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/22 20:32:47 INFO sdk_worker.run: Done consuming work.
19/11/22 20:32:47 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/22 20:32:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/22 20:32:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 20:32:47 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestrddZNv/job_13cffa30-e3cd-4d32-b843-46763ca8ee7d/MANIFEST
19/11/22 20:32:47 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestrddZNv/job_13cffa30-e3cd-4d32-b843-46763ca8ee7d/MANIFEST -> 0 artifacts
19/11/22 20:32:48 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/22 20:32:48 INFO sdk_worker_main.main: Logging handler created.
19/11/22 20:32:48 INFO sdk_worker_main.start: Status HTTP server running at localhost:36147
19/11/22 20:32:48 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/22 20:32:48 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/22 20:32:48 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574454765.61_2babbb90-400c-4246-8b30-53e74580afe6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/22 20:32:48 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574454765.61', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49473', 'job_port': u'0'}
19/11/22 20:32:48 INFO statecache.__init__: Creating state cache with size 0
19/11/22 20:32:48 INFO sdk_worker.__init__: Creating insecure control channel for localhost:32931.
19/11/22 20:32:48 INFO sdk_worker.__init__: Control channel established.
19/11/22 20:32:48 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/22 20:32:48 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/22 20:32:48 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41771.
19/11/22 20:32:48 INFO sdk_worker.create_state_handler: State channel established.
19/11/22 20:32:48 INFO data_plane.create_data_channel: Creating client data channel for localhost:39975
19/11/22 20:32:48 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/22 20:32:48 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/22 20:32:48 INFO sdk_worker.run: No more requests from control plane
19/11/22 20:32:48 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/22 20:32:48 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 20:32:48 INFO data_plane.close: Closing all cached grpc data channels.
19/11/22 20:32:48 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/22 20:32:48 INFO sdk_worker.run: Done consuming work.
19/11/22 20:32:48 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/22 20:32:48 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/22 20:32:48 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 20:32:48 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestrddZNv/job_13cffa30-e3cd-4d32-b843-46763ca8ee7d/MANIFEST
19/11/22 20:32:48 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestrddZNv/job_13cffa30-e3cd-4d32-b843-46763ca8ee7d/MANIFEST -> 0 artifacts
19/11/22 20:32:49 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/22 20:32:49 INFO sdk_worker_main.main: Logging handler created.
19/11/22 20:32:49 INFO sdk_worker_main.start: Status HTTP server running at localhost:38857
19/11/22 20:32:49 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/22 20:32:49 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/22 20:32:49 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574454765.61_2babbb90-400c-4246-8b30-53e74580afe6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/22 20:32:49 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574454765.61', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49473', 'job_port': u'0'}
19/11/22 20:32:49 INFO statecache.__init__: Creating state cache with size 0
19/11/22 20:32:49 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39497.
19/11/22 20:32:49 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/22 20:32:49 INFO sdk_worker.__init__: Control channel established.
19/11/22 20:32:49 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/22 20:32:49 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37345.
19/11/22 20:32:49 INFO sdk_worker.create_state_handler: State channel established.
19/11/22 20:32:49 INFO data_plane.create_data_channel: Creating client data channel for localhost:35571
19/11/22 20:32:49 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/22 20:32:49 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/22 20:32:49 INFO sdk_worker.run: No more requests from control plane
19/11/22 20:32:49 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/22 20:32:49 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 20:32:49 INFO data_plane.close: Closing all cached grpc data channels.
19/11/22 20:32:49 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/22 20:32:49 INFO sdk_worker.run: Done consuming work.
19/11/22 20:32:49 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/22 20:32:49 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/22 20:32:49 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 20:32:49 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestrddZNv/job_13cffa30-e3cd-4d32-b843-46763ca8ee7d/MANIFEST
19/11/22 20:32:49 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestrddZNv/job_13cffa30-e3cd-4d32-b843-46763ca8ee7d/MANIFEST -> 0 artifacts
19/11/22 20:32:50 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/22 20:32:50 INFO sdk_worker_main.main: Logging handler created.
19/11/22 20:32:50 INFO sdk_worker_main.start: Status HTTP server running at localhost:41891
19/11/22 20:32:50 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/22 20:32:50 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/22 20:32:50 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574454765.61_2babbb90-400c-4246-8b30-53e74580afe6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/22 20:32:50 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574454765.61', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49473', 'job_port': u'0'}
19/11/22 20:32:50 INFO statecache.__init__: Creating state cache with size 0
19/11/22 20:32:50 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34829.
19/11/22 20:32:50 INFO sdk_worker.__init__: Control channel established.
19/11/22 20:32:50 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/22 20:32:50 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/22 20:32:50 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45879.
19/11/22 20:32:50 INFO sdk_worker.create_state_handler: State channel established.
19/11/22 20:32:50 INFO data_plane.create_data_channel: Creating client data channel for localhost:35047
19/11/22 20:32:50 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/22 20:32:50 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/22 20:32:50 INFO sdk_worker.run: No more requests from control plane
19/11/22 20:32:50 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/22 20:32:50 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 20:32:50 INFO data_plane.close: Closing all cached grpc data channels.
19/11/22 20:32:50 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/22 20:32:50 INFO sdk_worker.run: Done consuming work.
19/11/22 20:32:50 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/22 20:32:50 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/22 20:32:50 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 20:32:50 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestrddZNv/job_13cffa30-e3cd-4d32-b843-46763ca8ee7d/MANIFEST
19/11/22 20:32:50 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestrddZNv/job_13cffa30-e3cd-4d32-b843-46763ca8ee7d/MANIFEST -> 0 artifacts
19/11/22 20:32:51 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/22 20:32:51 INFO sdk_worker_main.main: Logging handler created.
19/11/22 20:32:51 INFO sdk_worker_main.start: Status HTTP server running at localhost:46869
19/11/22 20:32:51 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/22 20:32:51 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/22 20:32:51 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574454765.61_2babbb90-400c-4246-8b30-53e74580afe6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/22 20:32:51 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574454765.61', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49473', 'job_port': u'0'}
19/11/22 20:32:51 INFO statecache.__init__: Creating state cache with size 0
19/11/22 20:32:51 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37343.
19/11/22 20:32:51 INFO sdk_worker.__init__: Control channel established.
19/11/22 20:32:51 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/22 20:32:51 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/22 20:32:51 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42147.
19/11/22 20:32:51 INFO sdk_worker.create_state_handler: State channel established.
19/11/22 20:32:51 INFO data_plane.create_data_channel: Creating client data channel for localhost:39403
19/11/22 20:32:51 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/22 20:32:51 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/22 20:32:51 INFO sdk_worker.run: No more requests from control plane
19/11/22 20:32:51 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/22 20:32:51 INFO data_plane.close: Closing all cached grpc data channels.
19/11/22 20:32:51 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/22 20:32:51 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 20:32:51 INFO sdk_worker.run: Done consuming work.
19/11/22 20:32:51 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/22 20:32:51 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/22 20:32:51 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 20:32:51 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574454765.61_2babbb90-400c-4246-8b30-53e74580afe6 finished.
19/11/22 20:32:51 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/22 20:32:51 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestrddZNv/job_13cffa30-e3cd-4d32-b843-46763ca8ee7d/MANIFEST has 0 artifact locations
19/11/22 20:32:51 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestrddZNv/job_13cffa30-e3cd-4d32-b843-46763ca8ee7d/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
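
For context: each SDK harness in the log above is launched as a PROCESS environment, with the runner shelling out to the sdk_worker.sh script named in 'environment_config'. A minimal sketch of assembling such options from Python follows; the endpoint, port, and script path are placeholders, not values from this build:

    from apache_beam.options.pipeline_options import PipelineOptions

    # Placeholder endpoint and script path; the real suite wires in its own.
    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:8099',
        '--environment_type=PROCESS',
        '--environment_config={"command": "/path/to/sdk_worker.sh"}',
        '--experiments=beam_fn_api',
    ])

    # Flags that no registered PipelineOptions subclass defines (for example
    # the suite's --job_server_timeout) are kept out of the parsed result and
    # trigger the "Discarding unparseable args" warning seen throughout this log.
    print(options.get_all_options())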
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
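
The timeout does not come from gRPC itself but from a watchdog in portable_runner_test.py (the 'handler' frame at the bottom of the traceback). A minimal sketch of that pattern using only the standard library; everything except the handler name and the exception message is illustrative:

    import signal

    TIMEOUT_SECS = 60

    def handler(signum, frame):
        # Raising BaseException rather than Exception ensures the timeout
        # cannot be swallowed by broad `except Exception:` clauses in the
        # code under test.
        raise BaseException('Timed out after %d seconds.' % TIMEOUT_SECS)

    signal.signal(signal.SIGALRM, handler)  # install the watchdog
    signal.alarm(TIMEOUT_SECS)              # arm it
    try:
        run_pipeline_under_test()           # placeholder for the blocking test body
    finally:
        signal.alarm(0)                     # disarm on the way out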

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

Thread dumps from the timeout watchdog:

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140115774818048)>

# Thread: <Thread(Thread-120, started daemon 140115783210752)>

# Thread: <_MainThread(MainThread, started 140116562441984)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140115748067072)>

# Thread: <Thread(Thread-124, started daemon 140115756459776)>

# Thread: <_MainThread(MainThread, started 140116562441984)>

# Thread: <Thread(Thread-120, started daemon 140115783210752)>

# Thread: <Thread(wait_until_finish_read, started daemon 140115774818048)>
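
Both timed-out tests block in the same place: a pipeline built under a `with` statement, whose __exit__ runs the pipeline and waits for it, checked with assert_that/equal_to. A self-contained sketch of that test shape, with illustrative data:

    import apache_beam as beam
    from apache_beam.testing.test_pipeline import TestPipeline
    from apache_beam.testing.util import assert_that, equal_to

    # Leaving the `with` block triggers pipeline.run().wait_until_finish(),
    # which is the call the tracebacks above show hanging on the job's
    # state stream until the watchdog fires.
    with TestPipeline() as p:
        actual = p | beam.Create([1, 2, 3]) | beam.Map(lambda x: x * 10)
        assert_that(actual, equal_to([10, 20, 30]))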

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574454755.73_6083f3f8-7a9b-4a5d-ba88-e4e3799a3c1f failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
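
Unlike the two timeouts, test_sdf_with_watermark_tracking fails fast: the Spark portable runner has no bundle checkpoint handler registered, so the splittable DoFn's checkpoint request surfaces as an UnsupportedOperationException, which wait_until_finish re-raises as a RuntimeError carrying the job id, terminal state, and last error message. A sketch of observing that terminal state (`pipeline` is assumed to be an already-constructed apache_beam.Pipeline):

    result = pipeline.run()
    try:
        result.wait_until_finish()
    except RuntimeError as exc:
        # For a job that ends in state FAILED, the portable runner raises
        # RuntimeError with the job id, state, and last error message,
        # exactly as in the traceback above.
        print('Pipeline failed: %s' % exc)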

----------------------------------------------------------------------
Ran 38 tests in 351.562s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 53s
60 actionable tasks: 48 executed, 12 from cache

Publishing build scan...
https://gradle.com/s/cu3nmpfttqmz6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1605

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1605/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-8658] [BEAM-8781] Optionally set jar and artifact staging port in

[kcweaver] Pass artifact port to FlinkJarJobServer as well.

[kcweaver] Move FlinkRunnerOptions to pipeline_options.

[kcweaver] [BEAM-8796] Optionally configure static ports for job and expansion

[lukecwik] [BEAM-7948] Add time-based cache threshold support in the Java data s…


------------------------------------------
[...truncated 1.34 MB...]
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/22 19:13:27 INFO sdk_worker.run: No more requests from control plane
19/11/22 19:13:27 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/22 19:13:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 19:13:27 INFO data_plane.close: Closing all cached grpc data channels.
19/11/22 19:13:27 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/22 19:13:27 INFO sdk_worker.run: Done consuming work.
19/11/22 19:13:27 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/22 19:13:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/22 19:13:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 19:13:27 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestAZojHj/job_6c9cc2ea-311d-4cde-aaa2-64db9ef1d7a0/MANIFEST
19/11/22 19:13:27 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestAZojHj/job_6c9cc2ea-311d-4cde-aaa2-64db9ef1d7a0/MANIFEST -> 0 artifacts
19/11/22 19:13:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/22 19:13:27 INFO sdk_worker_main.main: Logging handler created.
19/11/22 19:13:27 INFO sdk_worker_main.start: Status HTTP server running at localhost:42075
19/11/22 19:13:27 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/22 19:13:27 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/22 19:13:27 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574450005.26_a22aadcc-7dab-4943-8221-ac8a8183ce25', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/22 19:13:27 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574450005.26', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48213', 'job_port': u'0'}
19/11/22 19:13:27 INFO statecache.__init__: Creating state cache with size 0
19/11/22 19:13:27 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35609.
19/11/22 19:13:27 INFO sdk_worker.__init__: Control channel established.
19/11/22 19:13:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/22 19:13:27 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/22 19:13:27 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40611.
19/11/22 19:13:27 INFO sdk_worker.create_state_handler: State channel established.
19/11/22 19:13:27 INFO data_plane.create_data_channel: Creating client data channel for localhost:45389
19/11/22 19:13:27 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/22 19:13:28 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/22 19:13:28 INFO sdk_worker.run: No more requests from control plane
19/11/22 19:13:28 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/22 19:13:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 19:13:28 INFO data_plane.close: Closing all cached grpc data channels.
19/11/22 19:13:28 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/22 19:13:28 INFO sdk_worker.run: Done consuming work.
19/11/22 19:13:28 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/22 19:13:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/22 19:13:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 19:13:28 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestAZojHj/job_6c9cc2ea-311d-4cde-aaa2-64db9ef1d7a0/MANIFEST
19/11/22 19:13:28 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestAZojHj/job_6c9cc2ea-311d-4cde-aaa2-64db9ef1d7a0/MANIFEST -> 0 artifacts
19/11/22 19:13:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/22 19:13:28 INFO sdk_worker_main.main: Logging handler created.
19/11/22 19:13:28 INFO sdk_worker_main.start: Status HTTP server running at localhost:45667
19/11/22 19:13:28 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/22 19:13:28 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/22 19:13:28 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574450005.26_a22aadcc-7dab-4943-8221-ac8a8183ce25', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/22 19:13:28 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574450005.26', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48213', 'job_port': u'0'}
19/11/22 19:13:28 INFO statecache.__init__: Creating state cache with size 0
19/11/22 19:13:28 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43463.
19/11/22 19:13:28 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/22 19:13:28 INFO sdk_worker.__init__: Control channel established.
19/11/22 19:13:28 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/22 19:13:28 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39883.
19/11/22 19:13:28 INFO sdk_worker.create_state_handler: State channel established.
19/11/22 19:13:28 INFO data_plane.create_data_channel: Creating client data channel for localhost:34295
19/11/22 19:13:28 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/22 19:13:29 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/22 19:13:29 INFO sdk_worker.run: No more requests from control plane
19/11/22 19:13:29 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/22 19:13:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 19:13:29 INFO data_plane.close: Closing all cached grpc data channels.
19/11/22 19:13:29 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/22 19:13:29 INFO sdk_worker.run: Done consuming work.
19/11/22 19:13:29 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/22 19:13:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/22 19:13:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 19:13:29 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestAZojHj/job_6c9cc2ea-311d-4cde-aaa2-64db9ef1d7a0/MANIFEST
19/11/22 19:13:29 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestAZojHj/job_6c9cc2ea-311d-4cde-aaa2-64db9ef1d7a0/MANIFEST -> 0 artifacts
19/11/22 19:13:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/22 19:13:29 INFO sdk_worker_main.main: Logging handler created.
19/11/22 19:13:29 INFO sdk_worker_main.start: Status HTTP server running at localhost:38431
19/11/22 19:13:29 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/22 19:13:29 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/22 19:13:29 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574450005.26_a22aadcc-7dab-4943-8221-ac8a8183ce25', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/22 19:13:29 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574450005.26', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48213', 'job_port': u'0'}
19/11/22 19:13:29 INFO statecache.__init__: Creating state cache with size 0
19/11/22 19:13:29 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42835.
19/11/22 19:13:29 INFO sdk_worker.__init__: Control channel established.
19/11/22 19:13:29 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/22 19:13:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/22 19:13:29 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39903.
19/11/22 19:13:29 INFO sdk_worker.create_state_handler: State channel established.
19/11/22 19:13:29 INFO data_plane.create_data_channel: Creating client data channel for localhost:36785
19/11/22 19:13:29 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/22 19:13:29 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/22 19:13:29 INFO sdk_worker.run: No more requests from control plane
19/11/22 19:13:29 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/22 19:13:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 19:13:29 INFO data_plane.close: Closing all cached grpc data channels.
19/11/22 19:13:29 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/22 19:13:29 INFO sdk_worker.run: Done consuming work.
19/11/22 19:13:29 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/22 19:13:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/22 19:13:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 19:13:30 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestAZojHj/job_6c9cc2ea-311d-4cde-aaa2-64db9ef1d7a0/MANIFEST
19/11/22 19:13:30 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestAZojHj/job_6c9cc2ea-311d-4cde-aaa2-64db9ef1d7a0/MANIFEST -> 0 artifacts
19/11/22 19:13:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/22 19:13:30 INFO sdk_worker_main.main: Logging handler created.
19/11/22 19:13:30 INFO sdk_worker_main.start: Status HTTP server running at localhost:37553
19/11/22 19:13:30 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/22 19:13:30 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/22 19:13:30 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574450005.26_a22aadcc-7dab-4943-8221-ac8a8183ce25', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/22 19:13:30 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574450005.26', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48213', 'job_port': u'0'}
19/11/22 19:13:30 INFO statecache.__init__: Creating state cache with size 0
19/11/22 19:13:30 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40827.
19/11/22 19:13:30 INFO sdk_worker.__init__: Control channel established.
19/11/22 19:13:30 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/22 19:13:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/22 19:13:30 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34529.
19/11/22 19:13:30 INFO sdk_worker.create_state_handler: State channel established.
19/11/22 19:13:30 INFO data_plane.create_data_channel: Creating client data channel for localhost:35361
19/11/22 19:13:30 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/22 19:13:30 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/22 19:13:30 INFO sdk_worker.run: No more requests from control plane
19/11/22 19:13:30 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/22 19:13:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 19:13:30 INFO data_plane.close: Closing all cached grpc data channels.
19/11/22 19:13:30 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/22 19:13:30 INFO sdk_worker.run: Done consuming work.
19/11/22 19:13:30 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/22 19:13:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/22 19:13:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 19:13:30 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574450005.26_a22aadcc-7dab-4943-8221-ac8a8183ce25 finished.
19/11/22 19:13:30 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/22 19:13:30 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestAZojHj/job_6c9cc2ea-311d-4cde-aaa2-64db9ef1d7a0/MANIFEST has 0 artifact locations
19/11/22 19:13:30 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestAZojHj/job_6c9cc2ea-311d-4cde-aaa2-64db9ef1d7a0/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

Thread dumps from the timeout watchdog:

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139755593000704)>

# Thread: <Thread(Thread-116, started daemon 139755601393408)>

# Thread: <_MainThread(MainThread, started 139756389017344)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139755574642432)>

# Thread: <Thread(Thread-122, started daemon 139755583035136)>

# Thread: <Thread(Thread-116, started daemon 139755601393408)>

# Thread: <_MainThread(MainThread, started 139756389017344)>

# Thread: <Thread(wait_until_finish_read, started daemon 139755593000704)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574449995.78_3792a144-0e2e-471b-b11f-2767c55e6b98 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 308.320s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 44s
60 actionable tasks: 52 executed, 8 from cache

Publishing build scan...
https://gradle.com/s/guftplwjvya62

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1604

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1604/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/22 18:15:34 INFO sdk_worker.run: No more requests from control plane
19/11/22 18:15:34 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/22 18:15:34 INFO data_plane.close: Closing all cached grpc data channels.
19/11/22 18:15:34 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 18:15:34 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/22 18:15:34 INFO sdk_worker.run: Done consuming work.
19/11/22 18:15:34 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/22 18:15:34 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/22 18:15:34 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 18:15:34 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestf_B5Wz/job_dd3aa8f9-260e-43a4-bcec-741c0bd5a7de/MANIFEST
19/11/22 18:15:34 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestf_B5Wz/job_dd3aa8f9-260e-43a4-bcec-741c0bd5a7de/MANIFEST -> 0 artifacts
19/11/22 18:15:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/22 18:15:35 INFO sdk_worker_main.main: Logging handler created.
19/11/22 18:15:35 INFO sdk_worker_main.start: Status HTTP server running at localhost:40805
19/11/22 18:15:35 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/22 18:15:35 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/22 18:15:35 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574446532.71_af5f0539-5805-4a48-bdd7-f7bfa0a7cce8', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/22 18:15:35 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574446532.71', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53241'}
19/11/22 18:15:35 INFO statecache.__init__: Creating state cache with size 0
19/11/22 18:15:35 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44539.
19/11/22 18:15:35 INFO sdk_worker.__init__: Control channel established.
19/11/22 18:15:35 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/22 18:15:35 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/22 18:15:35 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:32821.
19/11/22 18:15:35 INFO sdk_worker.create_state_handler: State channel established.
19/11/22 18:15:35 INFO data_plane.create_data_channel: Creating client data channel for localhost:41199
19/11/22 18:15:35 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/22 18:15:35 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/22 18:15:35 INFO sdk_worker.run: No more requests from control plane
19/11/22 18:15:35 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/22 18:15:35 INFO data_plane.close: Closing all cached grpc data channels.
19/11/22 18:15:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 18:15:35 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/22 18:15:35 INFO sdk_worker.run: Done consuming work.
19/11/22 18:15:35 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/22 18:15:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/22 18:15:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 18:15:35 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestf_B5Wz/job_dd3aa8f9-260e-43a4-bcec-741c0bd5a7de/MANIFEST
19/11/22 18:15:35 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestf_B5Wz/job_dd3aa8f9-260e-43a4-bcec-741c0bd5a7de/MANIFEST -> 0 artifacts
19/11/22 18:15:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/22 18:15:36 INFO sdk_worker_main.main: Logging handler created.
19/11/22 18:15:36 INFO sdk_worker_main.start: Status HTTP server running at localhost:33215
19/11/22 18:15:36 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/22 18:15:36 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/22 18:15:36 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574446532.71_af5f0539-5805-4a48-bdd7-f7bfa0a7cce8', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/22 18:15:36 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574446532.71', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53241'}
19/11/22 18:15:36 INFO statecache.__init__: Creating state cache with size 0
19/11/22 18:15:36 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44103.
19/11/22 18:15:36 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/22 18:15:36 INFO sdk_worker.__init__: Control channel established.
19/11/22 18:15:36 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/22 18:15:36 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43405.
19/11/22 18:15:36 INFO sdk_worker.create_state_handler: State channel established.
19/11/22 18:15:36 INFO data_plane.create_data_channel: Creating client data channel for localhost:41647
19/11/22 18:15:36 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/22 18:15:36 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/22 18:15:36 INFO sdk_worker.run: No more requests from control plane
19/11/22 18:15:36 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/22 18:15:36 INFO data_plane.close: Closing all cached grpc data channels.
19/11/22 18:15:36 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/22 18:15:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 18:15:36 INFO sdk_worker.run: Done consuming work.
19/11/22 18:15:36 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/22 18:15:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/22 18:15:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 18:15:36 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestf_B5Wz/job_dd3aa8f9-260e-43a4-bcec-741c0bd5a7de/MANIFEST
19/11/22 18:15:36 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestf_B5Wz/job_dd3aa8f9-260e-43a4-bcec-741c0bd5a7de/MANIFEST -> 0 artifacts
19/11/22 18:15:37 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/22 18:15:37 INFO sdk_worker_main.main: Logging handler created.
19/11/22 18:15:37 INFO sdk_worker_main.start: Status HTTP server running at localhost:42889
19/11/22 18:15:37 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/22 18:15:37 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/22 18:15:37 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574446532.71_af5f0539-5805-4a48-bdd7-f7bfa0a7cce8', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/22 18:15:37 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574446532.71', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53241'}
19/11/22 18:15:37 INFO statecache.__init__: Creating state cache with size 0
19/11/22 18:15:37 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40897.
19/11/22 18:15:37 INFO sdk_worker.__init__: Control channel established.
19/11/22 18:15:37 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/22 18:15:37 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/22 18:15:37 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43331.
19/11/22 18:15:37 INFO sdk_worker.create_state_handler: State channel established.
19/11/22 18:15:37 INFO data_plane.create_data_channel: Creating client data channel for localhost:33527
19/11/22 18:15:37 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/22 18:15:37 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/22 18:15:37 INFO sdk_worker.run: No more requests from control plane
19/11/22 18:15:37 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/22 18:15:37 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 18:15:37 INFO data_plane.close: Closing all cached grpc data channels.
19/11/22 18:15:37 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/22 18:15:37 INFO sdk_worker.run: Done consuming work.
19/11/22 18:15:37 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/22 18:15:37 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/22 18:15:37 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 18:15:37 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestf_B5Wz/job_dd3aa8f9-260e-43a4-bcec-741c0bd5a7de/MANIFEST
19/11/22 18:15:37 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestf_B5Wz/job_dd3aa8f9-260e-43a4-bcec-741c0bd5a7de/MANIFEST -> 0 artifacts
19/11/22 18:15:38 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/22 18:15:38 INFO sdk_worker_main.main: Logging handler created.
19/11/22 18:15:38 INFO sdk_worker_main.start: Status HTTP server running at localhost:37737
19/11/22 18:15:38 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/22 18:15:38 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/22 18:15:38 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574446532.71_af5f0539-5805-4a48-bdd7-f7bfa0a7cce8', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/22 18:15:38 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574446532.71', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53241'}
19/11/22 18:15:38 INFO statecache.__init__: Creating state cache with size 0
19/11/22 18:15:38 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40213.
19/11/22 18:15:38 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/22 18:15:38 INFO sdk_worker.__init__: Control channel established.
19/11/22 18:15:38 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/22 18:15:38 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46479.
19/11/22 18:15:38 INFO sdk_worker.create_state_handler: State channel established.
19/11/22 18:15:38 INFO data_plane.create_data_channel: Creating client data channel for localhost:38601
19/11/22 18:15:38 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/22 18:15:38 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/22 18:15:38 INFO sdk_worker.run: No more requests from control plane
19/11/22 18:15:38 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/22 18:15:38 INFO data_plane.close: Closing all cached grpc data channels.
19/11/22 18:15:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 18:15:38 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/22 18:15:38 INFO sdk_worker.run: Done consuming work.
19/11/22 18:15:38 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/22 18:15:38 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/22 18:15:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 18:15:38 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574446532.71_af5f0539-5805-4a48-bdd7-f7bfa0a7cce8 finished.
19/11/22 18:15:38 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/22 18:15:38 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestf_B5Wz/job_dd3aa8f9-260e-43a4-bcec-741c0bd5a7de/MANIFEST has 0 artifact locations
19/11/22 18:15:38 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestf_B5Wz/job_dd3aa8f9-260e-43a4-bcec-741c0bd5a7de/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
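
The worker logs above come from a pipeline submitted to the portable Spark job server with the pipeline_options dict shown (environment_type PROCESS, a job_endpoint on localhost). As a rough illustration only, a client producing options like these could look like the sketch below; the job endpoint is taken from the log, and the sdk_worker.sh path is a placeholder, not the workspace path used by this job. All flags (--runner, --job_endpoint, --environment_type, --environment_config, --experiments) are standard Beam pipeline options.

    # Minimal sketch: submit a trivial pipeline to an already-running
    # portable job server, mirroring the options in the logs above.
    # localhost:46901 is the endpoint from one logged run; it differs per
    # run. "/path/to/sdk_worker.sh" is a hypothetical placeholder.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:46901',
        '--environment_type=PROCESS',   # worker launched as a subprocess
        '--environment_config={"command": "/path/to/sdk_worker.sh"}',
        '--experiments=beam_fn_api',
    ])

    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create([1, 2, 3]) | beam.Map(lambda x: x + 1)
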
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140457531651840)>

# Thread: <Thread(Thread-120, started daemon 140457523259136)>

# Thread: <_MainThread(MainThread, started 140458310883072)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140457032017664)>

# Thread: <Thread(Thread-126, started daemon 140457023624960)>

# Thread: <Thread(Thread-120, started daemon 140457523259136)>

# Thread: <Thread(wait_until_finish_read, started daemon 140457531651840)>

# Thread: <_MainThread(MainThread, started 140458310883072)>

Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
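
The "# Thread:" lines interleaved with these tracebacks are printed by the test suite's timeout watchdog: when wait_until_finish blocks past 60 seconds, a handler dumps every live thread and then raises BaseException, and its output races with the traceback on stderr (which is why the raw log splices them mid-line). The sketch below shows the general pattern, assuming a Unix signal-based watchdog like the one the traceback points at; the names TIMEOUT_SECS and handler are illustrative, not Beam's exact implementation.

    # Sketch of a signal-based watchdog producing output like the
    # "# Thread:" dumps in this log. Illustrative only.
    import signal
    import threading

    TIMEOUT_SECS = 60

    def handler(signum, frame):
        msg = 'Timed out after %d seconds.' % TIMEOUT_SECS
        print('=' * 20 + ' ' + msg + ' ' + '=' * 20)
        for t in threading.enumerate():
            print('# Thread: %s' % t)  # one line per live thread
        # BaseException (not Exception) so ordinary handlers can't swallow it
        raise BaseException(msg)

    signal.signal(signal.SIGALRM, handler)
    signal.alarm(TIMEOUT_SECS)  # cancel with signal.alarm(0) on success
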

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574446522.84_b1ff756d-0783-45e7-b15f-31a6cf181dec failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
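
Unlike the two timeouts above, this failure surfaces a concrete runner gap: the Spark portable runner has no bundle checkpoint handler, so the SDF watermark-tracking test fails outright. The assert_that/equal_to pattern visible in all three tracebacks is Beam's standard pipeline assertion; a self-contained example (runs locally on the default DirectRunner, with hypothetical sample data) looks like:

    # assert_that registers a check that runs when the pipeline finishes;
    # a mismatch raises at the end of the `with` block.
    import apache_beam as beam
    from apache_beam.testing.util import assert_that, equal_to

    with beam.Pipeline() as p:
        actual = p | beam.Create(['a', 'b']) | beam.Map(lambda c: c.upper())
        assert_that(actual, equal_to(['A', 'B']))
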

----------------------------------------------------------------------
Ran 38 tests in 345.520s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 27s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/fxpsidmuv3e5a

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1603

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1603/display/redirect>

Changes:


------------------------------------------
[...truncated 1.33 MB...]
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/22 12:12:26 INFO sdk_worker.run: No more requests from control plane
19/11/22 12:12:26 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/22 12:12:26 INFO data_plane.close: Closing all cached grpc data channels.
19/11/22 12:12:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 12:12:26 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/22 12:12:26 INFO sdk_worker.run: Done consuming work.
19/11/22 12:12:26 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/22 12:12:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/22 12:12:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 12:12:26 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest0utPGt/job_97aeadb2-8ca1-426f-8d77-6a9fa7d3cf7a/MANIFEST
19/11/22 12:12:26 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest0utPGt/job_97aeadb2-8ca1-426f-8d77-6a9fa7d3cf7a/MANIFEST -> 0 artifacts
19/11/22 12:12:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/22 12:12:27 INFO sdk_worker_main.main: Logging handler created.
19/11/22 12:12:27 INFO sdk_worker_main.start: Status HTTP server running at localhost:41627
19/11/22 12:12:27 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/22 12:12:27 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/22 12:12:27 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574424744.69_c27e1a8b-e77c-4a77-ad30-bff81f34ae0b', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/22 12:12:27 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574424744.69', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49379'}
19/11/22 12:12:27 INFO statecache.__init__: Creating state cache with size 0
19/11/22 12:12:27 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33407.
19/11/22 12:12:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/22 12:12:27 INFO sdk_worker.__init__: Control channel established.
19/11/22 12:12:27 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/22 12:12:27 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33373.
19/11/22 12:12:27 INFO sdk_worker.create_state_handler: State channel established.
19/11/22 12:12:27 INFO data_plane.create_data_channel: Creating client data channel for localhost:40191
19/11/22 12:12:27 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/22 12:12:27 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/22 12:12:27 INFO sdk_worker.run: No more requests from control plane
19/11/22 12:12:27 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/22 12:12:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 12:12:27 INFO data_plane.close: Closing all cached grpc data channels.
19/11/22 12:12:27 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/22 12:12:27 INFO sdk_worker.run: Done consuming work.
19/11/22 12:12:27 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/22 12:12:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/22 12:12:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 12:12:27 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest0utPGt/job_97aeadb2-8ca1-426f-8d77-6a9fa7d3cf7a/MANIFEST
19/11/22 12:12:27 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest0utPGt/job_97aeadb2-8ca1-426f-8d77-6a9fa7d3cf7a/MANIFEST -> 0 artifacts
19/11/22 12:12:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/22 12:12:28 INFO sdk_worker_main.main: Logging handler created.
19/11/22 12:12:28 INFO sdk_worker_main.start: Status HTTP server running at localhost:32867
19/11/22 12:12:28 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/22 12:12:28 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/22 12:12:28 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574424744.69_c27e1a8b-e77c-4a77-ad30-bff81f34ae0b', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/22 12:12:28 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574424744.69', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49379'}
19/11/22 12:12:28 INFO statecache.__init__: Creating state cache with size 0
19/11/22 12:12:28 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39581.
19/11/22 12:12:28 INFO sdk_worker.__init__: Control channel established.
19/11/22 12:12:28 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/22 12:12:28 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/22 12:12:28 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36429.
19/11/22 12:12:28 INFO sdk_worker.create_state_handler: State channel established.
19/11/22 12:12:28 INFO data_plane.create_data_channel: Creating client data channel for localhost:35499
19/11/22 12:12:28 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/22 12:12:28 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/22 12:12:28 INFO sdk_worker.run: No more requests from control plane
19/11/22 12:12:28 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/22 12:12:28 INFO data_plane.close: Closing all cached grpc data channels.
19/11/22 12:12:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 12:12:28 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/22 12:12:28 INFO sdk_worker.run: Done consuming work.
19/11/22 12:12:28 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/22 12:12:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/22 12:12:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 12:12:28 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest0utPGt/job_97aeadb2-8ca1-426f-8d77-6a9fa7d3cf7a/MANIFEST
19/11/22 12:12:28 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest0utPGt/job_97aeadb2-8ca1-426f-8d77-6a9fa7d3cf7a/MANIFEST -> 0 artifacts
19/11/22 12:12:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/22 12:12:29 INFO sdk_worker_main.main: Logging handler created.
19/11/22 12:12:29 INFO sdk_worker_main.start: Status HTTP server running at localhost:36061
19/11/22 12:12:29 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/22 12:12:29 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/22 12:12:29 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574424744.69_c27e1a8b-e77c-4a77-ad30-bff81f34ae0b', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/22 12:12:29 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574424744.69', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49379'}
19/11/22 12:12:29 INFO statecache.__init__: Creating state cache with size 0
19/11/22 12:12:29 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39615.
19/11/22 12:12:29 INFO sdk_worker.__init__: Control channel established.
19/11/22 12:12:29 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/22 12:12:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/22 12:12:29 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37101.
19/11/22 12:12:29 INFO sdk_worker.create_state_handler: State channel established.
19/11/22 12:12:29 INFO data_plane.create_data_channel: Creating client data channel for localhost:33003
19/11/22 12:12:29 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/22 12:12:29 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/22 12:12:29 INFO sdk_worker.run: No more requests from control plane
19/11/22 12:12:29 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/22 12:12:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 12:12:29 INFO data_plane.close: Closing all cached grpc data channels.
19/11/22 12:12:29 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/22 12:12:29 INFO sdk_worker.run: Done consuming work.
19/11/22 12:12:29 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/22 12:12:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/22 12:12:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 12:12:29 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest0utPGt/job_97aeadb2-8ca1-426f-8d77-6a9fa7d3cf7a/MANIFEST
19/11/22 12:12:29 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest0utPGt/job_97aeadb2-8ca1-426f-8d77-6a9fa7d3cf7a/MANIFEST -> 0 artifacts
19/11/22 12:12:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/22 12:12:30 INFO sdk_worker_main.main: Logging handler created.
19/11/22 12:12:30 INFO sdk_worker_main.start: Status HTTP server running at localhost:37979
19/11/22 12:12:30 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/22 12:12:30 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/22 12:12:30 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574424744.69_c27e1a8b-e77c-4a77-ad30-bff81f34ae0b', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/22 12:12:30 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574424744.69', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49379'}
19/11/22 12:12:30 INFO statecache.__init__: Creating state cache with size 0
19/11/22 12:12:30 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43469.
19/11/22 12:12:30 INFO sdk_worker.__init__: Control channel established.
19/11/22 12:12:30 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/22 12:12:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/22 12:12:30 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38133.
19/11/22 12:12:30 INFO sdk_worker.create_state_handler: State channel established.
19/11/22 12:12:30 INFO data_plane.create_data_channel: Creating client data channel for localhost:39567
19/11/22 12:12:30 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/22 12:12:30 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/22 12:12:30 INFO sdk_worker.run: No more requests from control plane
19/11/22 12:12:30 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/22 12:12:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 12:12:30 INFO data_plane.close: Closing all cached grpc data channels.
19/11/22 12:12:30 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/22 12:12:30 INFO sdk_worker.run: Done consuming work.
19/11/22 12:12:30 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/22 12:12:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/22 12:12:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 12:12:30 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574424744.69_c27e1a8b-e77c-4a77-ad30-bff81f34ae0b finished.
19/11/22 12:12:30 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/22 12:12:30 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktest0utPGt/job_97aeadb2-8ca1-426f-8d77-6a9fa7d3cf7a/MANIFEST has 0 artifact locations
19/11/22 12:12:30 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktest0utPGt/job_97aeadb2-8ca1-426f-8d77-6a9fa7d3cf7a/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140596382807808)>

# Thread: <Thread(Thread-118, started daemon 140596653979392)>

# Thread: <_MainThread(MainThread, started 140597169370880)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140595880257280)>

# Thread: <Thread(Thread-124, started daemon 140595888649984)>

# Thread: <Thread(Thread-118, started daemon 140596653979392)>

# Thread: <Thread(wait_until_finish_read, started daemon 140596382807808)>

# Thread: <_MainThread(MainThread, started 140597169370880)>

Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574424735.56_e5a2fe6b-fefc-48ee-a78c-2d5e2a72ac19 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 347.751s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 52s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/4566kblotofm6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1602

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1602/display/redirect?page=changes>

Changes:

[33895511+aromanenko-dev] [BEAM-7636] Migrate SqsIO to AWS SDK V2 for Java


------------------------------------------
[...truncated 1.33 MB...]
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/22 10:30:12 INFO sdk_worker.run: No more requests from control plane
19/11/22 10:30:12 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/22 10:30:12 INFO data_plane.close: Closing all cached grpc data channels.
19/11/22 10:30:12 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/22 10:30:12 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 10:30:12 INFO sdk_worker.run: Done consuming work.
19/11/22 10:30:12 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/22 10:30:12 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/22 10:30:12 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 10:30:12 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktesti6hAXj/job_2fd3aa43-c437-4bfc-9f08-74888861639e/MANIFEST
19/11/22 10:30:12 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktesti6hAXj/job_2fd3aa43-c437-4bfc-9f08-74888861639e/MANIFEST -> 0 artifacts
19/11/22 10:30:12 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/22 10:30:12 INFO sdk_worker_main.main: Logging handler created.
19/11/22 10:30:12 INFO sdk_worker_main.start: Status HTTP server running at localhost:38757
19/11/22 10:30:12 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/22 10:30:12 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/22 10:30:12 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574418610.24_d0066d68-6fc6-4df7-a764-5e1d3ac86879', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/22 10:30:12 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574418610.24', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54843'}
19/11/22 10:30:12 INFO statecache.__init__: Creating state cache with size 0
19/11/22 10:30:12 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43997.
19/11/22 10:30:12 INFO sdk_worker.__init__: Control channel established.
19/11/22 10:30:12 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/22 10:30:12 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/22 10:30:12 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36639.
19/11/22 10:30:12 INFO sdk_worker.create_state_handler: State channel established.
19/11/22 10:30:12 INFO data_plane.create_data_channel: Creating client data channel for localhost:36345
19/11/22 10:30:12 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/22 10:30:13 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/22 10:30:13 INFO sdk_worker.run: No more requests from control plane
19/11/22 10:30:13 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/22 10:30:13 INFO data_plane.close: Closing all cached grpc data channels.
19/11/22 10:30:13 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 10:30:13 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/22 10:30:13 INFO sdk_worker.run: Done consuming work.
19/11/22 10:30:13 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/22 10:30:13 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/22 10:30:13 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 10:30:13 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktesti6hAXj/job_2fd3aa43-c437-4bfc-9f08-74888861639e/MANIFEST
19/11/22 10:30:13 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktesti6hAXj/job_2fd3aa43-c437-4bfc-9f08-74888861639e/MANIFEST -> 0 artifacts
19/11/22 10:30:13 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/22 10:30:13 INFO sdk_worker_main.main: Logging handler created.
19/11/22 10:30:13 INFO sdk_worker_main.start: Status HTTP server running at localhost:39379
19/11/22 10:30:13 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/22 10:30:13 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/22 10:30:13 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574418610.24_d0066d68-6fc6-4df7-a764-5e1d3ac86879', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/22 10:30:13 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574418610.24', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54843'}
19/11/22 10:30:13 INFO statecache.__init__: Creating state cache with size 0
19/11/22 10:30:13 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39941.
19/11/22 10:30:13 INFO sdk_worker.__init__: Control channel established.
19/11/22 10:30:13 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/22 10:30:13 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/22 10:30:13 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41239.
19/11/22 10:30:13 INFO sdk_worker.create_state_handler: State channel established.
19/11/22 10:30:13 INFO data_plane.create_data_channel: Creating client data channel for localhost:36385
19/11/22 10:30:13 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/22 10:30:14 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/22 10:30:14 INFO sdk_worker.run: No more requests from control plane
19/11/22 10:30:14 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/22 10:30:14 INFO data_plane.close: Closing all cached grpc data channels.
19/11/22 10:30:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 10:30:14 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/22 10:30:14 INFO sdk_worker.run: Done consuming work.
19/11/22 10:30:14 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/22 10:30:14 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/22 10:30:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 10:30:14 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktesti6hAXj/job_2fd3aa43-c437-4bfc-9f08-74888861639e/MANIFEST
19/11/22 10:30:14 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktesti6hAXj/job_2fd3aa43-c437-4bfc-9f08-74888861639e/MANIFEST -> 0 artifacts
19/11/22 10:30:14 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/22 10:30:14 INFO sdk_worker_main.main: Logging handler created.
19/11/22 10:30:14 INFO sdk_worker_main.start: Status HTTP server running at localhost:37775
19/11/22 10:30:14 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/22 10:30:14 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/22 10:30:14 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574418610.24_d0066d68-6fc6-4df7-a764-5e1d3ac86879', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/22 10:30:14 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574418610.24', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54843'}
19/11/22 10:30:14 INFO statecache.__init__: Creating state cache with size 0
19/11/22 10:30:14 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39095.
19/11/22 10:30:14 INFO sdk_worker.__init__: Control channel established.
19/11/22 10:30:14 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/22 10:30:14 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/22 10:30:14 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34393.
19/11/22 10:30:14 INFO sdk_worker.create_state_handler: State channel established.
19/11/22 10:30:14 INFO data_plane.create_data_channel: Creating client data channel for localhost:45029
19/11/22 10:30:14 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/22 10:30:14 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/22 10:30:14 INFO sdk_worker.run: No more requests from control plane
19/11/22 10:30:14 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/22 10:30:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 10:30:14 INFO data_plane.close: Closing all cached grpc data channels.
19/11/22 10:30:14 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/22 10:30:14 INFO sdk_worker.run: Done consuming work.
19/11/22 10:30:14 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/22 10:30:14 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/22 10:30:15 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 10:30:15 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktesti6hAXj/job_2fd3aa43-c437-4bfc-9f08-74888861639e/MANIFEST
19/11/22 10:30:15 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktesti6hAXj/job_2fd3aa43-c437-4bfc-9f08-74888861639e/MANIFEST -> 0 artifacts
19/11/22 10:30:15 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/22 10:30:15 INFO sdk_worker_main.main: Logging handler created.
19/11/22 10:30:15 INFO sdk_worker_main.start: Status HTTP server running at localhost:46479
19/11/22 10:30:15 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/22 10:30:15 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/22 10:30:15 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574418610.24_d0066d68-6fc6-4df7-a764-5e1d3ac86879', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/22 10:30:15 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574418610.24', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54843'}
19/11/22 10:30:15 INFO statecache.__init__: Creating state cache with size 0
19/11/22 10:30:15 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36051.
19/11/22 10:30:15 INFO sdk_worker.__init__: Control channel established.
19/11/22 10:30:15 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/22 10:30:15 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/22 10:30:15 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33143.
19/11/22 10:30:15 INFO sdk_worker.create_state_handler: State channel established.
19/11/22 10:30:15 INFO data_plane.create_data_channel: Creating client data channel for localhost:38507
19/11/22 10:30:15 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/22 10:30:15 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/22 10:30:15 INFO sdk_worker.run: No more requests from control plane
19/11/22 10:30:15 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/22 10:30:15 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 10:30:15 INFO data_plane.close: Closing all cached grpc data channels.
19/11/22 10:30:15 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/22 10:30:15 INFO sdk_worker.run: Done consuming work.
19/11/22 10:30:15 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/22 10:30:15 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/22 10:30:15 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 10:30:15 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574418610.24_d0066d68-6fc6-4df7-a764-5e1d3ac86879 finished.
19/11/22 10:30:15 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/22 10:30:15 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktesti6hAXj/job_2fd3aa43-c437-4bfc-9f08-74888861639e/MANIFEST has 0 artifact locations
19/11/22 10:30:15 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktesti6hAXj/job_2fd3aa43-c437-4bfc-9f08-74888861639e/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140499922974464)>

# Thread: <Thread(Thread-119, started daemon 140500413929216)>

# Thread: <_MainThread(MainThread, started 140501193160448)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140499906189056)>

# Thread: <Thread(Thread-125, started daemon 140499914581760)>

# Thread: <Thread(Thread-119, started daemon 140500413929216)>

# Thread: <Thread(wait_until_finish_read, started daemon 140499922974464)>

# Thread: <_MainThread(MainThread, started 140501193160448)>

Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574418600.43_d8ff08c2-e097-4337-8e52-f6df4c29561f failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 327.364s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 49s
60 actionable tasks: 59 executed, 1 from cache

Publishing build scan...
https://gradle.com/s/wvqdokfsru7iy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1601

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1601/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/22 06:12:49 INFO sdk_worker.run: No more requests from control plane
19/11/22 06:12:49 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/22 06:12:49 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 06:12:49 INFO data_plane.close: Closing all cached grpc data channels.
19/11/22 06:12:49 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/22 06:12:49 INFO sdk_worker.run: Done consuming work.
19/11/22 06:12:49 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/22 06:12:49 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/22 06:12:49 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 06:12:49 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest4fYxiq/job_d662c67d-991b-444d-8079-80c80624d053/MANIFEST
19/11/22 06:12:49 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest4fYxiq/job_d662c67d-991b-444d-8079-80c80624d053/MANIFEST -> 0 artifacts
19/11/22 06:12:49 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/22 06:12:49 INFO sdk_worker_main.main: Logging handler created.
19/11/22 06:12:49 INFO sdk_worker_main.start: Status HTTP server running at localhost:32893
19/11/22 06:12:49 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/22 06:12:49 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/22 06:12:49 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574403167.62_a174f674-df5f-4822-9ab6-79ce79ebb0d0', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/22 06:12:49 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574403167.62', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52485'}
19/11/22 06:12:50 INFO statecache.__init__: Creating state cache with size 0
19/11/22 06:12:50 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35899.
19/11/22 06:12:50 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/22 06:12:50 INFO sdk_worker.__init__: Control channel established.
19/11/22 06:12:50 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/22 06:12:50 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40639.
19/11/22 06:12:50 INFO sdk_worker.create_state_handler: State channel established.
19/11/22 06:12:50 INFO data_plane.create_data_channel: Creating client data channel for localhost:39855
19/11/22 06:12:50 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/22 06:12:50 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/22 06:12:50 INFO sdk_worker.run: No more requests from control plane
19/11/22 06:12:50 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/22 06:12:50 INFO data_plane.close: Closing all cached grpc data channels.
19/11/22 06:12:50 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 06:12:50 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/22 06:12:50 INFO sdk_worker.run: Done consuming work.
19/11/22 06:12:50 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/22 06:12:50 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/22 06:12:50 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 06:12:50 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest4fYxiq/job_d662c67d-991b-444d-8079-80c80624d053/MANIFEST
19/11/22 06:12:50 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest4fYxiq/job_d662c67d-991b-444d-8079-80c80624d053/MANIFEST -> 0 artifacts
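
The pipeline_options dicts logged by sdk_worker_main.main above map to ordinary portable-runner flags. A minimal sketch of submitting a pipeline against such a job server; the endpoint and worker-script path below are placeholders, not values from this build:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Placeholder endpoint and path, mirroring the environment_type=PROCESS
    # setup the Spark ValidatesRunner suite uses.
    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:8099',
        '--environment_type=PROCESS',
        '--environment_config={"command": "/path/to/sdk_worker.sh"}',
    ])

    with beam.Pipeline(options=options) as p:
        p | beam.Create([1, 2, 3]) | beam.Map(lambda x: x + 1)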
19/11/22 06:12:50 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/22 06:12:50 INFO sdk_worker_main.main: Logging handler created.
19/11/22 06:12:50 INFO sdk_worker_main.start: Status HTTP server running at localhost:33127
19/11/22 06:12:50 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/22 06:12:50 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/22 06:12:50 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574403167.62_a174f674-df5f-4822-9ab6-79ce79ebb0d0', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/22 06:12:50 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574403167.62', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52485'}
19/11/22 06:12:50 INFO statecache.__init__: Creating state cache with size 0
19/11/22 06:12:50 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43119.
19/11/22 06:12:50 INFO sdk_worker.__init__: Control channel established.
19/11/22 06:12:50 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/22 06:12:50 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/22 06:12:50 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38991.
19/11/22 06:12:50 INFO sdk_worker.create_state_handler: State channel established.
19/11/22 06:12:50 INFO data_plane.create_data_channel: Creating client data channel for localhost:38837
19/11/22 06:12:50 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/22 06:12:50 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/22 06:12:50 INFO sdk_worker.run: No more requests from control plane
19/11/22 06:12:50 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/22 06:12:50 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 06:12:50 INFO data_plane.close: Closing all cached grpc data channels.
19/11/22 06:12:50 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/22 06:12:50 INFO sdk_worker.run: Done consuming work.
19/11/22 06:12:50 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/22 06:12:50 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/22 06:12:51 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 06:12:51 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest4fYxiq/job_d662c67d-991b-444d-8079-80c80624d053/MANIFEST
19/11/22 06:12:51 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest4fYxiq/job_d662c67d-991b-444d-8079-80c80624d053/MANIFEST -> 0 artifacts
19/11/22 06:12:51 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/22 06:12:51 INFO sdk_worker_main.main: Logging handler created.
19/11/22 06:12:51 INFO sdk_worker_main.start: Status HTTP server running at localhost:35363
19/11/22 06:12:51 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/22 06:12:51 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/22 06:12:51 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574403167.62_a174f674-df5f-4822-9ab6-79ce79ebb0d0', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/22 06:12:51 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574403167.62', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52485'}
19/11/22 06:12:51 INFO statecache.__init__: Creating state cache with size 0
19/11/22 06:12:51 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34633.
19/11/22 06:12:51 INFO sdk_worker.__init__: Control channel established.
19/11/22 06:12:51 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/22 06:12:51 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/22 06:12:51 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38039.
19/11/22 06:12:51 INFO sdk_worker.create_state_handler: State channel established.
19/11/22 06:12:51 INFO data_plane.create_data_channel: Creating client data channel for localhost:44685
19/11/22 06:12:51 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/22 06:12:51 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/22 06:12:51 INFO sdk_worker.run: No more requests from control plane
19/11/22 06:12:51 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/22 06:12:51 INFO data_plane.close: Closing all cached grpc data channels.
19/11/22 06:12:51 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 06:12:51 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/22 06:12:51 INFO sdk_worker.run: Done consuming work.
19/11/22 06:12:51 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/22 06:12:51 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/22 06:12:51 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 06:12:51 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest4fYxiq/job_d662c67d-991b-444d-8079-80c80624d053/MANIFEST
19/11/22 06:12:51 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest4fYxiq/job_d662c67d-991b-444d-8079-80c80624d053/MANIFEST -> 0 artifacts
19/11/22 06:12:52 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/22 06:12:52 INFO sdk_worker_main.main: Logging handler created.
19/11/22 06:12:52 INFO sdk_worker_main.start: Status HTTP server running at localhost:45473
19/11/22 06:12:52 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/22 06:12:52 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/22 06:12:52 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574403167.62_a174f674-df5f-4822-9ab6-79ce79ebb0d0', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/22 06:12:52 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574403167.62', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52485'}
19/11/22 06:12:52 INFO statecache.__init__: Creating state cache with size 0
19/11/22 06:12:52 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41839.
19/11/22 06:12:52 INFO sdk_worker.__init__: Control channel established.
19/11/22 06:12:52 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/22 06:12:52 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/22 06:12:52 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38793.
19/11/22 06:12:52 INFO sdk_worker.create_state_handler: State channel established.
19/11/22 06:12:52 INFO data_plane.create_data_channel: Creating client data channel for localhost:43913
19/11/22 06:12:52 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/22 06:12:52 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/22 06:12:52 INFO sdk_worker.run: No more requests from control plane
19/11/22 06:12:52 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/22 06:12:52 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 06:12:52 INFO data_plane.close: Closing all cached grpc data channels.
19/11/22 06:12:52 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/22 06:12:52 INFO sdk_worker.run: Done consuming work.
19/11/22 06:12:52 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/22 06:12:52 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/22 06:12:52 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 06:12:52 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574403167.62_a174f674-df5f-4822-9ab6-79ce79ebb0d0 finished.
19/11/22 06:12:52 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/22 06:12:52 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktest4fYxiq/job_d662c67d-991b-444d-8079-80c80624d053/MANIFEST has 0 artifact locations
19/11/22 06:12:52 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktest4fYxiq/job_d662c67d-991b-444d-8079-80c80624d053/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
==================== Timed out after 60 seconds. ====================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140543929603840)>
# Thread: <Thread(Thread-119, started daemon 140543937996544)>
# Thread: <_MainThread(MainThread, started 140545062631168)>
# Thread: <Thread(wait_until_finish_read, started daemon 140543921211136)>
# Thread: <Thread(Thread-125, started daemon 140543912818432)>
# Thread: <Thread(Thread-119, started daemon 140543937996544)>
# Thread: <_MainThread(MainThread, started 140545062631168)>
# Thread: <Thread(wait_until_finish_read, started daemon 140543929603840)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574403159.03_64a8d7dc-d667-4ef6-b1a2-c4a849b7573e failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 318.809s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 59s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/6jxxuhgjll2di

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1600

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1600/display/redirect?page=changes>

Changes:

[github] [BEAM-7278, BEAM-2530] Add support for using a Java linkage testing tool


------------------------------------------
[...truncated 3.79 KB...]
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.

> Configure project :sdks:python:container
Found go 1.12 in /usr/bin/go, use it.

> Configure project :sdks:go
Found go 1.12 in /usr/bin/go, use it.

> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :runners:spark:job-server:processResources NO-SOURCE
> Task :runners:spark:processResources
> Task :model:fn-execution:extractProto
> Task :model:job-management:extractProto

> Task :sdks:python:container:goPrepare
Use project GOPATH: <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/container/.gogradle/project_gopath>

> Task :sdks:go:goPrepare
Use project GOPATH: <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/go/.gogradle/project_gopath>

> Task :model:job-management:processResources
> Task :model:fn-execution:processResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :sdks:java:core:processResources
> Task :sdks:python:container:gofmt
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :model:pipeline:generateProto
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar

> Task :sdks:python:test-suites:portable:py2:setupVirtualenv
New python executable in <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/bin/python>
Installing setuptools, pip, wheel...

> Task :model:job-management:extractIncludeProto
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:generateProto
> Task :model:fn-execution:generateProto
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes

> Task :sdks:python:test-suites:portable:py2:setupVirtualenv
done.
Running virtualenv with interpreter /usr/bin/python2.7
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support

> Task :model:pipeline:shadowJar

> Task :sdks:python:test-suites:portable:py2:setupVirtualenv
Collecting tox==3.11.1
  Using cached https://files.pythonhosted.org/packages/8b/38/71c2fe0c3915fc0e93bdd1bf8cd697be48cdacedbdcd438e0f0629c69024/tox-3.11.1-py2.py3-none-any.whl

> Task :model:job-management:shadowJar

> Task :sdks:go:resolveBuildDependencies
Resolving cloud.google.com/go: commit='4f6c921ec566a33844f4e7879b31cd8575a6982d', urls=[https://code.googlesource.com/gocloud]
Resolving github.com/Shopify/sarama: commit='541689b9f4212043471eb537fa72da507025d3ea', urls=[https://github.com/Shopify/sarama.git, git@github.com:Shopify/sarama.git]
Resolving github.com/armon/consul-api: commit='eb2c6b5be1b66bab83016e0b05f01b8d5496ffbd', urls=[https://github.com/armon/consul-api.git, git@github.com:armon/consul-api.git]

> Task :model:fn-execution:shadowJar
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes

> Task :sdks:python:test-suites:portable:py2:setupVirtualenv
Collecting grpcio-tools==1.3.5
  Using cached https://files.pythonhosted.org/packages/05/f6/0296e29b1bac6f85d2a8556d48adf825307f73109a3c2c17fb734292db0a/grpcio_tools-1.3.5-cp27-cp27mu-manylinux1_x86_64.whl
Collecting six<2,>=1.0.0
  Using cached https://files.pythonhosted.org/packages/65/26/32b8464df2a97e6dd1b656ed26b2c194606c16fe163c695a992b36c11cdf/six-1.13.0-py2.py3-none-any.whl
Requirement already satisfied, skipping upgrade: setuptools>=30.0.0 in <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/lib/python2.7/site-packages> (from tox==3.11.1) (41.6.0)
Collecting pluggy<1,>=0.3.0
  Using cached https://files.pythonhosted.org/packages/a0/28/85c7aa31b80d150b772fbe4a229487bc6644da9ccb7e427dd8cc60cb8a62/pluggy-0.13.1-py2.py3-none-any.whl
Collecting toml>=0.9.4
  Using cached https://files.pythonhosted.org/packages/a2/12/ced7105d2de62fa7c8fb5fce92cc4ce66b57c95fb875e9318dba7f8c5db0/toml-0.10.0-py2.py3-none-any.whl
Collecting virtualenv>=14.0.0
  Using cached https://files.pythonhosted.org/packages/c5/97/00dd42a0fc41e9016b23f07ec7f657f636cb672fad9cf72b80f8f65c6a46/virtualenv-16.7.7-py2.py3-none-any.whl
Collecting py<2,>=1.4.17
  Using cached https://files.pythonhosted.org/packages/76/bc/394ad449851729244a97857ee14d7cba61ddb268dce3db538ba2f2ba1f0f/py-1.8.0-py2.py3-none-any.whl
Processing /home/jenkins/.cache/pip/wheels/66/13/60/ef107438d90e4aad6320e3424e50cfce5e16d1e9aad6d38294/filelock-3.0.12-cp27-none-any.whl
Collecting grpcio>=1.3.5
  Using cached https://files.pythonhosted.org/packages/0c/47/35cc9f6fd43f8e5ed74fcc6dd8a0cb2e89c118dd3ef7a8ff25e65bf0909f/grpcio-1.25.0-cp27-cp27mu-manylinux2010_x86_64.whl
Collecting protobuf>=3.2.0
  Using cached https://files.pythonhosted.org/packages/c5/49/ffa7ab9c52ec56b535cffec3bc844254c073888e6d4aeee464671ac97480/protobuf-3.10.0-cp27-cp27mu-manylinux1_x86_64.whl
Collecting importlib-metadata>=0.12; python_version < "3.8"
  Using cached https://files.pythonhosted.org/packages/f6/d2/40b3fa882147719744e6aa50ac39cf7a22a913cbcba86a0371176c425a3b/importlib_metadata-0.23-py2.py3-none-any.whl
Collecting futures>=2.2.0; python_version < "3.2"
  Using cached https://files.pythonhosted.org/packages/d8/a6/f46ae3f1da0cd4361c344888f59ec2f5785e69c872e175a748ef6071cdb5/futures-3.3.0-py2-none-any.whl
Collecting enum34>=1.0.4; python_version < "3.4"
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting contextlib2; python_version < "3"
  Using cached https://files.pythonhosted.org/packages/85/60/370352f7ef6aa96c52fb001831622f50f923c1d575427d021b8ab3311236/contextlib2-0.6.0.post1-py2.py3-none-any.whl
Collecting zipp>=0.5
  Using cached https://files.pythonhosted.org/packages/74/3d/1ee25a26411ba0401b43c6376d2316a71addcc72ef8690b101b4ea56d76a/zipp-0.6.0-py2.py3-none-any.whl
Collecting pathlib2; python_version == "3.4.*" or python_version < "3"
  Using cached https://files.pythonhosted.org/packages/e9/45/9c82d3666af4ef9f221cbb954e1d77ddbb513faf552aea6df5f37f1a4859/pathlib2-2.3.5-py2.py3-none-any.whl
Collecting configparser>=3.5; python_version < "3"
  Using cached https://files.pythonhosted.org/packages/7a/2a/95ed0501cf5d8709490b1d3a3f9b5cf340da6c433f896bbe9ce08dbe6785/configparser-4.0.2-py2.py3-none-any.whl
Collecting more-itertools
  Using cached https://files.pythonhosted.org/packages/2f/9d/dcfe59e213093695f108508af1214cf9cd95cc5489e46877ec5cb56369e5/more_itertools-5.0.0-py2-none-any.whl
Processing /home/jenkins/.cache/pip/wheels/91/95/75/19c98a91239878abbc7c59970abd3b4e0438a7dd5b61778335/scandir-1.10.0-cp27-cp27mu-linux_x86_64.whl
Installing collected packages: six, contextlib2, more-itertools, zipp, scandir, pathlib2, configparser, importlib-metadata, pluggy, toml, virtualenv, py, filelock, tox, futures, enum34, grpcio, protobuf, grpcio-tools
Successfully installed configparser-4.0.2 contextlib2-0.6.0.post1 enum34-1.1.6 filelock-3.0.12 futures-3.3.0 grpcio-1.25.0 grpcio-tools-1.3.5 importlib-metadata-0.23 more-itertools-5.0.0 pathlib2-2.3.5 pluggy-0.13.1 protobuf-3.10.0 py-1.8.0 scandir-1.10.0 six-1.13.0 toml-0.10.0 tox-3.11.1 virtualenv-16.7.7 zipp-0.6.0

> Task :sdks:java:core:shadowJar

> Task :sdks:go:resolveBuildDependencies
Resolving github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving github.com/cpuguy83/go-md2man: commit='dc9f53734905c233adfc09fd4f063dce63ce3daf', urls=[https://github.com/cpuguy83/go-md2man.git, git@github.com:cpuguy83/go-md2man.git]
Resolving github.com/davecgh/go-spew: commit='87df7c60d5820d0f8ae11afede5aa52325c09717', urls=[https://github.com/davecgh/go-spew.git, git@github.com:davecgh/go-spew.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving github.com/eapache/go-resiliency: commit='ef9aaa7ea8bd2448429af1a77cf41b2b3b34bdd6', urls=[https://github.com/eapache/go-resiliency.git, git@github.com:eapache/go-resiliency.git]
Resolving github.com/eapache/go-xerial-snappy: commit='bb955e01b9346ac19dc29eb16586c90ded99a98c', urls=[https://github.com/eapache/go-xerial-snappy.git, git@github.com:eapache/go-xerial-snappy.git]
Resolving github.com/eapache/queue: commit='44cc805cf13205b55f69e14bcb69867d1ae92f98', urls=[https://github.com/eapache/queue.git, git@github.com:eapache/queue.git]
Resolving github.com/fsnotify/fsnotify: commit='c2828203cd70a50dcccfb2761f8b1f8ceef9a8e9', urls=[https://github.com/fsnotify/fsnotify.git, git@github.com:fsnotify/fsnotify.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving github.com/golang/glog: commit='23def4e6c14b4da8ac2ed8007337bc5eb5007998', urls=[https://github.com/golang/glog.git, git@github.com:golang/glog.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving github.com/golang/mock: commit='b3e60bcdc577185fce3cf625fc96b62857ce5574', urls=[https://github.com/golang/mock.git, git@github.com:golang/mock.git]
Resolving github.com/golang/protobuf: commit='3a3da3a4e26776cc22a79ef46d5d58477532dede', urls=[https://github.com/golang/protobuf.git, git@github.com:golang/protobuf.git]
Resolving github.com/golang/snappy: commit='553a641470496b2327abcac10b36396bd98e45c9', urls=[https://github.com/golang/snappy.git, git@github.com:golang/snappy.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving github.com/google/go-cmp: commit='3af367b6b30c263d47e8895973edcca9a49cf029', urls=[https://github.com/google/go-cmp.git, git@github.com:google/go-cmp.git]
Resolving github.com/google/pprof: commit='a8f279b7952b27edbcb72e5a6c69ee9be4c8ad93', urls=[https://github.com/google/pprof.git, git@github.com:google/pprof.git]
Resolving github.com/googleapis/gax-go: commit='317e0006254c44a0ac427cc52a0e083ff0b9622f', urls=[https://github.com/googleapis/gax-go.git, git@github.com:googleapis/gax-go.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]

> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :vendor:sdks-java-extensions-protobuf:shadowJar
> Task :runners:core-construction-java:jar
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar

> Task :sdks:go:resolveBuildDependencies
Resolving github.com/hashicorp/hcl: commit='23c074d0eceb2b8a5bfdbb271ab780cde70f05a8', urls=[https://github.com/hashicorp/hcl.git, git@github.com:hashicorp/hcl.git]
Resolving github.com/ianlancetaylor/demangle: commit='4883227f66371e02c4948937d3e2be1664d9be38', urls=[https://github.com/ianlancetaylor/demangle.git, git@github.com:ianlancetaylor/demangle.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving github.com/kr/fs: commit='2788f0dbd16903de03cb8186e5c7d97b69ad387b', urls=[https://github.com/kr/fs.git, git@github.com:kr/fs.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving github.com/magiconair/properties: commit='49d762b9817ba1c2e9d0c69183c2b4a8b8f1d934', urls=[https://github.com/magiconair/properties.git, git@github.com:magiconair/properties.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving github.com/mitchellh/go-homedir: commit='b8bc1bf767474819792c23f32d8286a45736f1c6', urls=[https://github.com/mitchellh/go-homedir.git, git@github.com:mitchellh/go-homedir.git]
Resolving github.com/mitchellh/mapstructure: commit='a4e142e9c047c904fa2f1e144d9a84e6133024bc', urls=[https://github.com/mitchellh/mapstructure.git, git@github.com:mitchellh/mapstructure.git]
Resolving github.com/nightlyone/lockfile: commit='0ad87eef1443f64d3d8c50da647e2b1552851124', urls=[https://github.com/nightlyone/lockfile, git@github.com:nightlyone/lockfile.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving github.com/openzipkin/zipkin-go: commit='3741243b287094fda649c7f0fa74bd51f37dc122', urls=[https://github.com/openzipkin/zipkin-go.git, git@github.com:openzipkin/zipkin-go.git]
Resolving github.com/pelletier/go-toml: commit='acdc4509485b587f5e675510c4f2c63e90ff68a8', urls=[https://github.com/pelletier/go-toml.git, git@github.com:pelletier/go-toml.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving github.com/pierrec/lz4: commit='ed8d4cc3b461464e69798080a0092bd028910298', urls=[https://github.com/pierrec/lz4.git, git@github.com:pierrec/lz4.git]
Resolving github.com/pierrec/xxHash: commit='a0006b13c722f7f12368c00a3d3c2ae8a999a0c6', urls=[https://github.com/pierrec/xxHash.git, git@github.com:pierrec/xxHash.git]
Resolving github.com/pkg/errors: commit='30136e27e2ac8d167177e8a583aa4c3fea5be833', urls=[https://github.com/pkg/errors.git, git@github.com:pkg/errors.git]
Resolving github.com/pkg/sftp: commit='22e9c1ccc02fc1b9fa3264572e49109b68a86947', urls=[https://github.com/pkg/sftp.git, git@github.com:pkg/sftp.git]
Resolving github.com/prometheus/client_golang: commit='9bb6ab929dcbe1c8393cd9ef70387cb69811bd1c', urls=[https://github.com/prometheus/client_golang.git, git@github.com:prometheus/client_golang.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving github.com/prometheus/procfs: commit='cb4147076ac75738c9a7d279075a253c0cc5acbd', urls=[https://github.com/prometheus/procfs.git, git@github.com:prometheus/procfs.git]
Resolving github.com/rcrowley/go-metrics: commit='8732c616f52954686704c8645fe1a9d59e9df7c1', urls=[https://github.com/rcrowley/go-metrics.git, git@github.com:rcrowley/go-metrics.git]
Resolving github.com/cpuguy83/go-md2man: commit='dc9f53734905c233adfc09fd4f063dce63ce3daf', urls=[https://github.com/cpuguy83/go-md2man.git, git@github.com:cpuguy83/go-md2man.git]
Resolving cached github.com/cpuguy83/go-md2man: commit='dc9f53734905c233adfc09fd4f063dce63ce3daf', urls=[https://github.com/cpuguy83/go-md2man.git, git@github.com:cpuguy83/go-md2man.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving github.com/spf13/afero: commit='bb8f1927f2a9d3ab41c9340aa034f6b803f4359c', urls=[https://github.com/spf13/afero.git, git@github.com:spf13/afero.git]
Resolving github.com/spf13/cast: commit='acbeb36b902d72a7a4c18e8f3241075e7ab763e4', urls=[https://github.com/spf13/cast.git, git@github.com:spf13/cast.git]
Resolving github.com/spf13/cobra: commit='93959269ad99e80983c9ba742a7e01203a4c0e4f', urls=[https://github.com/spf13/cobra.git, git@github.com:spf13/cobra.git]
Resolving github.com/spf13/jwalterweatherman: commit='7c0cea34c8ece3fbeb2b27ab9b59511d360fb394', urls=[https://github.com/spf13/jwalterweatherman.git, git@github.com:spf13/jwalterweatherman.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving github.com/spf13/viper: commit='aafc9e6bc7b7bb53ddaa75a5ef49a17d6e654be5', urls=[https://github.com/spf13/viper.git, git@github.com:spf13/viper.git]
Resolving github.com/stathat/go: commit='74669b9f388d9d788c97399a0824adbfee78400e', urls=[https://github.com/stathat/go.git, git@github.com:stathat/go.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving github.com/xordataexchange/crypt: commit='b2862e3d0a775f18c7cfe02273500ae307b61218', urls=[https://github.com/xordataexchange/crypt.git, git@github.com:xordataexchange/crypt.git]
Resolving go.opencensus.io: commit='aa2b39d1618ef56ba156f27cfcdae9042f68f0bc', urls=[https://github.com/census-instrumentation/opencensus-go]
Resolving golang.org/x/crypto: commit='d9133f5469342136e669e85192a26056b587f503', urls=[https://go.googlesource.com/crypto]
Resolving golang.org/x/debug: commit='95515998a8a4bd7448134b2cb5971dbeb12e0b77', urls=[https://go.googlesource.com/debug]
Resolving golang.org/x/net: commit='2fb46b16b8dda405028c50f7c7f0f9dd1fa6bfb1', urls=[https://go.googlesource.com/net]
Resolving golang.org/x/oauth2: commit='a032972e28060ca4f5644acffae3dfc268cc09db', urls=[https://go.googlesource.com/oauth2]
Resolving golang.org/x/sync: commit='fd80eb99c8f653c847d294a001bdf2a3a6f768f5', urls=[https://go.googlesource.com/sync]

> Task :sdks:java:harness:shadowJar

> Task :sdks:go:resolveBuildDependencies
Resolving golang.org/x/sys: commit='37707fdb30a5b38865cfb95e5aab41707daec7fd', urls=[https://go.googlesource.com/sys]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]

> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :runners:spark:compileJava FROM-CACHE
> Task :runners:spark:classes
> Task :runners:spark:jar
> Task :runners:spark:job-server:compileJava NO-SOURCE
> Task :runners:spark:job-server:classes UP-TO-DATE
> Task :runners:spark:job-server:shadowJar
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1599

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1599/display/redirect>

Changes:


------------------------------------------
[...truncated 1.31 MB...]
19/11/22 00:19:16 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 00:19:16 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowed_pardo_state_timers_1574381951.15_c133f3bd-1354-415a-8cee-0933a5ca2cba finished.
19/11/22 00:19:16 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/22 00:19:16 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestUzlvsl/job_84b72ede-f5dd-4f18-9358-e2ad3f95618d/MANIFEST has 0 artifact locations
19/11/22 00:19:16 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestUzlvsl/job_84b72ede-f5dd-4f18-9358-e2ad3f95618d/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function lift_combiners at 0x7f2f6e7d9050> ====================
19/11/22 00:19:17 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job test_windowing_1574381956.39_632cbcd2-58b0-4335-afa1-47e99ae8279c
19/11/22 00:19:17 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job invocation test_windowing_1574381956.39_632cbcd2-58b0-4335-afa1-47e99ae8279c
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
19/11/22 00:19:17 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
19/11/22 00:19:17 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 1 files. (Enable logging at DEBUG level to see which files will be staged.)
19/11/22 00:19:17 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1574381956.39_632cbcd2-58b0-4335-afa1-47e99ae8279c on Spark master local
19/11/22 00:19:17 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/11/22 00:19:17 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574381956.39_632cbcd2-58b0-4335-afa1-47e99ae8279c: Pipeline translated successfully. Computing outputs
19/11/22 00:19:17 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestUzlvsl/job_8eb2bab8-e509-4fae-814f-3565552f84c6/MANIFEST
19/11/22 00:19:17 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestUzlvsl/job_8eb2bab8-e509-4fae-814f-3565552f84c6/MANIFEST has 0 artifact locations
19/11/22 00:19:17 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestUzlvsl/job_8eb2bab8-e509-4fae-814f-3565552f84c6/MANIFEST -> 0 artifacts
19/11/22 00:19:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/22 00:19:17 INFO sdk_worker_main.main: Logging handler created.
19/11/22 00:19:17 INFO sdk_worker_main.start: Status HTTP server running at localhost:33757
19/11/22 00:19:17 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/22 00:19:17 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/22 00:19:17 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574381956.39_632cbcd2-58b0-4335-afa1-47e99ae8279c', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/22 00:19:17 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574381956.39', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:47663'}
19/11/22 00:19:17 INFO statecache.__init__: Creating state cache with size 0
19/11/22 00:19:17 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37803.
19/11/22 00:19:17 INFO sdk_worker.__init__: Control channel established.
19/11/22 00:19:17 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/22 00:19:17 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/11/22 00:19:17 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43941.
19/11/22 00:19:17 INFO sdk_worker.create_state_handler: State channel established.
19/11/22 00:19:17 INFO data_plane.create_data_channel: Creating client data channel for localhost:37721
19/11/22 00:19:17 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/22 00:19:18 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/22 00:19:18 INFO sdk_worker.run: No more requests from control plane
19/11/22 00:19:18 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/22 00:19:18 INFO data_plane.close: Closing all cached grpc data channels.
19/11/22 00:19:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 00:19:18 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/22 00:19:18 INFO sdk_worker.run: Done consuming work.
19/11/22 00:19:18 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/22 00:19:18 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/22 00:19:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 00:19:18 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestUzlvsl/job_8eb2bab8-e509-4fae-814f-3565552f84c6/MANIFEST
19/11/22 00:19:18 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestUzlvsl/job_8eb2bab8-e509-4fae-814f-3565552f84c6/MANIFEST -> 0 artifacts
19/11/22 00:19:18 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/22 00:19:18 INFO sdk_worker_main.main: Logging handler created.
19/11/22 00:19:18 INFO sdk_worker_main.start: Status HTTP server running at localhost:46649
19/11/22 00:19:18 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/22 00:19:18 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/22 00:19:18 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574381956.39_632cbcd2-58b0-4335-afa1-47e99ae8279c', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/22 00:19:18 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574381956.39', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:47663'}
19/11/22 00:19:18 INFO statecache.__init__: Creating state cache with size 0
19/11/22 00:19:18 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42081.
19/11/22 00:19:18 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/22 00:19:18 INFO sdk_worker.__init__: Control channel established.
19/11/22 00:19:18 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/22 00:19:18 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37757.
19/11/22 00:19:18 INFO sdk_worker.create_state_handler: State channel established.
19/11/22 00:19:18 INFO data_plane.create_data_channel: Creating client data channel for localhost:46293
19/11/22 00:19:18 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/22 00:19:18 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/22 00:19:18 INFO sdk_worker.run: No more requests from control plane
19/11/22 00:19:18 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/22 00:19:18 INFO data_plane.close: Closing all cached grpc data channels.
19/11/22 00:19:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 00:19:18 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/22 00:19:18 INFO sdk_worker.run: Done consuming work.
19/11/22 00:19:18 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/22 00:19:18 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/22 00:19:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 00:19:18 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestUzlvsl/job_8eb2bab8-e509-4fae-814f-3565552f84c6/MANIFEST
19/11/22 00:19:18 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestUzlvsl/job_8eb2bab8-e509-4fae-814f-3565552f84c6/MANIFEST -> 0 artifacts
19/11/22 00:19:19 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/22 00:19:19 INFO sdk_worker_main.main: Logging handler created.
19/11/22 00:19:19 INFO sdk_worker_main.start: Status HTTP server running at localhost:33387
19/11/22 00:19:19 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/22 00:19:19 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/22 00:19:19 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574381956.39_632cbcd2-58b0-4335-afa1-47e99ae8279c', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/22 00:19:19 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574381956.39', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:47663'}
19/11/22 00:19:19 INFO statecache.__init__: Creating state cache with size 0
19/11/22 00:19:19 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35853.
19/11/22 00:19:19 INFO sdk_worker.__init__: Control channel established.
19/11/22 00:19:19 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/22 00:19:19 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/22 00:19:19 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38525.
19/11/22 00:19:19 INFO sdk_worker.create_state_handler: State channel established.
19/11/22 00:19:19 INFO data_plane.create_data_channel: Creating client data channel for localhost:39039
19/11/22 00:19:19 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/22 00:19:19 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/22 00:19:19 INFO sdk_worker.run: No more requests from control plane
19/11/22 00:19:19 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/22 00:19:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 00:19:19 INFO data_plane.close: Closing all cached grpc data channels.
19/11/22 00:19:19 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/22 00:19:19 INFO sdk_worker.run: Done consuming work.
19/11/22 00:19:19 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/22 00:19:19 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/22 00:19:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 00:19:19 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestUzlvsl/job_8eb2bab8-e509-4fae-814f-3565552f84c6/MANIFEST
19/11/22 00:19:19 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestUzlvsl/job_8eb2bab8-e509-4fae-814f-3565552f84c6/MANIFEST -> 0 artifacts
19/11/22 00:19:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/22 00:19:20 INFO sdk_worker_main.main: Logging handler created.
19/11/22 00:19:20 INFO sdk_worker_main.start: Status HTTP server running at localhost:34877
19/11/22 00:19:20 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/22 00:19:20 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/22 00:19:20 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574381956.39_632cbcd2-58b0-4335-afa1-47e99ae8279c', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/22 00:19:20 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574381956.39', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:47663'}
19/11/22 00:19:20 INFO statecache.__init__: Creating state cache with size 0
19/11/22 00:19:20 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36861.
19/11/22 00:19:20 INFO sdk_worker.__init__: Control channel established.
19/11/22 00:19:20 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/22 00:19:20 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/22 00:19:20 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42721.
19/11/22 00:19:20 INFO sdk_worker.create_state_handler: State channel established.
19/11/22 00:19:20 INFO data_plane.create_data_channel: Creating client data channel for localhost:46349
19/11/22 00:19:20 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/22 00:19:20 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/22 00:19:20 INFO sdk_worker.run: No more requests from control plane
19/11/22 00:19:20 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/22 00:19:20 INFO data_plane.close: Closing all cached grpc data channels.
19/11/22 00:19:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 00:19:20 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/22 00:19:20 INFO sdk_worker.run: Done consuming work.
19/11/22 00:19:20 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/22 00:19:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/22 00:19:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 00:19:20 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestUzlvsl/job_8eb2bab8-e509-4fae-814f-3565552f84c6/MANIFEST
19/11/22 00:19:20 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestUzlvsl/job_8eb2bab8-e509-4fae-814f-3565552f84c6/MANIFEST -> 0 artifacts
19/11/22 00:19:21 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/22 00:19:21 INFO sdk_worker_main.main: Logging handler created.
19/11/22 00:19:21 INFO sdk_worker_main.start: Status HTTP server running at localhost:36301
19/11/22 00:19:21 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/22 00:19:21 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/22 00:19:21 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574381956.39_632cbcd2-58b0-4335-afa1-47e99ae8279c', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/22 00:19:21 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574381956.39', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:47663'}
19/11/22 00:19:21 INFO statecache.__init__: Creating state cache with size 0
19/11/22 00:19:21 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40719.
19/11/22 00:19:21 INFO sdk_worker.__init__: Control channel established.
19/11/22 00:19:21 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/22 00:19:21 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/22 00:19:21 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44259.
19/11/22 00:19:21 INFO sdk_worker.create_state_handler: State channel established.
19/11/22 00:19:21 INFO data_plane.create_data_channel: Creating client data channel for localhost:37001
19/11/22 00:19:21 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/22 00:19:21 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/22 00:19:21 INFO sdk_worker.run: No more requests from control plane
19/11/22 00:19:21 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/22 00:19:21 INFO data_plane.close: Closing all cached grpc data channels.
19/11/22 00:19:21 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 00:19:21 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/22 00:19:21 INFO sdk_worker.run: Done consuming work.
19/11/22 00:19:21 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/22 00:19:21 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/22 00:19:21 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/22 00:19:21 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574381956.39_632cbcd2-58b0-4335-afa1-47e99ae8279c finished.
19/11/22 00:19:21 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/22 00:19:21 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestUzlvsl/job_8eb2bab8-e509-4fae-814f-3565552f84c6/MANIFEST has 0 artifact locations
19/11/22 00:19:21 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestUzlvsl/job_8eb2bab8-e509-4fae-814f-3565552f84c6/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139841277576960)>

# Thread: <Thread(Thread-118, started daemon 139841260791552)>

# Thread: <_MainThread(MainThread, started 139842056808192)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574381947.92_4e13917d-e98d-4930-a3b7-a68a5cf8cbf9 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 272.122s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 53s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/edcvuzt4umccq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1598

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1598/display/redirect?page=changes>

Changes:

[pabloem] [BEAM-8016] Pipeline Graph (#10132)


------------------------------------------
[...truncated 1.32 MB...]
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 22:08:20 INFO sdk_worker.run: No more requests from control plane
19/11/21 22:08:20 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 22:08:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 22:08:20 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 22:08:20 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 22:08:20 INFO sdk_worker.run: Done consuming work.
19/11/21 22:08:20 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 22:08:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 22:08:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 22:08:21 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestAv4PBQ/job_88a1e087-1580-4555-a6d8-44a3cdf8b5ff/MANIFEST
19/11/21 22:08:21 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestAv4PBQ/job_88a1e087-1580-4555-a6d8-44a3cdf8b5ff/MANIFEST -> 0 artifacts
19/11/21 22:08:21 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 22:08:21 INFO sdk_worker_main.main: Logging handler created.
19/11/21 22:08:21 INFO sdk_worker_main.start: Status HTTP server running at localhost:35329
19/11/21 22:08:21 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 22:08:21 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 22:08:21 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574374098.83_0ab38dad-8dc5-4d20-bc2d-8693b3c568da', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 22:08:21 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574374098.83', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46779'}
19/11/21 22:08:21 INFO statecache.__init__: Creating state cache with size 0
19/11/21 22:08:21 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45989.
19/11/21 22:08:21 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/21 22:08:21 INFO sdk_worker.__init__: Control channel established.
19/11/21 22:08:21 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 22:08:21 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42505.
19/11/21 22:08:21 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 22:08:21 INFO data_plane.create_data_channel: Creating client data channel for localhost:46615
19/11/21 22:08:21 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 22:08:21 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 22:08:21 INFO sdk_worker.run: No more requests from control plane
19/11/21 22:08:21 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 22:08:21 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 22:08:21 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 22:08:21 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 22:08:21 INFO sdk_worker.run: Done consuming work.
19/11/21 22:08:21 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 22:08:21 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 22:08:22 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 22:08:22 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestAv4PBQ/job_88a1e087-1580-4555-a6d8-44a3cdf8b5ff/MANIFEST
19/11/21 22:08:22 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestAv4PBQ/job_88a1e087-1580-4555-a6d8-44a3cdf8b5ff/MANIFEST -> 0 artifacts
19/11/21 22:08:22 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 22:08:22 INFO sdk_worker_main.main: Logging handler created.
19/11/21 22:08:22 INFO sdk_worker_main.start: Status HTTP server running at localhost:43335
19/11/21 22:08:22 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 22:08:22 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 22:08:22 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574374098.83_0ab38dad-8dc5-4d20-bc2d-8693b3c568da', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 22:08:22 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574374098.83', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46779'}
19/11/21 22:08:22 INFO statecache.__init__: Creating state cache with size 0
19/11/21 22:08:22 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40259.
19/11/21 22:08:22 INFO sdk_worker.__init__: Control channel established.
19/11/21 22:08:22 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 22:08:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/21 22:08:22 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34283.
19/11/21 22:08:22 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 22:08:22 INFO data_plane.create_data_channel: Creating client data channel for localhost:34453
19/11/21 22:08:22 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 22:08:23 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 22:08:23 INFO sdk_worker.run: No more requests from control plane
19/11/21 22:08:23 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 22:08:23 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 22:08:23 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 22:08:23 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 22:08:23 INFO sdk_worker.run: Done consuming work.
19/11/21 22:08:23 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 22:08:23 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 22:08:23 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 22:08:23 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestAv4PBQ/job_88a1e087-1580-4555-a6d8-44a3cdf8b5ff/MANIFEST
19/11/21 22:08:23 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestAv4PBQ/job_88a1e087-1580-4555-a6d8-44a3cdf8b5ff/MANIFEST -> 0 artifacts
19/11/21 22:08:24 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 22:08:24 INFO sdk_worker_main.main: Logging handler created.
19/11/21 22:08:24 INFO sdk_worker_main.start: Status HTTP server running at localhost:36731
19/11/21 22:08:24 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 22:08:24 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 22:08:24 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574374098.83_0ab38dad-8dc5-4d20-bc2d-8693b3c568da', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 22:08:24 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574374098.83', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46779'}
19/11/21 22:08:24 INFO statecache.__init__: Creating state cache with size 0
19/11/21 22:08:24 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44783.
19/11/21 22:08:24 INFO sdk_worker.__init__: Control channel established.
19/11/21 22:08:24 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 22:08:24 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/21 22:08:24 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41165.
19/11/21 22:08:24 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 22:08:24 INFO data_plane.create_data_channel: Creating client data channel for localhost:40209
19/11/21 22:08:24 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 22:08:24 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 22:08:24 INFO sdk_worker.run: No more requests from control plane
19/11/21 22:08:24 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 22:08:24 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 22:08:24 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 22:08:24 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 22:08:24 INFO sdk_worker.run: Done consuming work.
19/11/21 22:08:24 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 22:08:24 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 22:08:24 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 22:08:24 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestAv4PBQ/job_88a1e087-1580-4555-a6d8-44a3cdf8b5ff/MANIFEST
19/11/21 22:08:24 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestAv4PBQ/job_88a1e087-1580-4555-a6d8-44a3cdf8b5ff/MANIFEST -> 0 artifacts
19/11/21 22:08:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 22:08:25 INFO sdk_worker_main.main: Logging handler created.
19/11/21 22:08:25 INFO sdk_worker_main.start: Status HTTP server running at localhost:40867
19/11/21 22:08:25 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 22:08:25 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 22:08:25 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574374098.83_0ab38dad-8dc5-4d20-bc2d-8693b3c568da', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 22:08:25 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574374098.83', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46779'}
19/11/21 22:08:25 INFO statecache.__init__: Creating state cache with size 0
19/11/21 22:08:25 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43627.
19/11/21 22:08:25 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/21 22:08:25 INFO sdk_worker.__init__: Control channel established.
19/11/21 22:08:25 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 22:08:25 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35145.
19/11/21 22:08:25 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 22:08:25 INFO data_plane.create_data_channel: Creating client data channel for localhost:38433
19/11/21 22:08:25 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 22:08:25 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 22:08:25 INFO sdk_worker.run: No more requests from control plane
19/11/21 22:08:25 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 22:08:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 22:08:25 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 22:08:25 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 22:08:25 INFO sdk_worker.run: Done consuming work.
19/11/21 22:08:25 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 22:08:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 22:08:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 22:08:25 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574374098.83_0ab38dad-8dc5-4d20-bc2d-8693b3c568da finished.
19/11/21 22:08:25 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/21 22:08:25 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestAv4PBQ/job_88a1e087-1580-4555-a6d8-44a3cdf8b5ff/MANIFEST has 0 artifact locations
19/11/21 22:08:25 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestAv4PBQ/job_88a1e087-1580-4555-a6d8-44a3cdf8b5ff/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139954235623168)>
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)

# Thread: <Thread(Thread-118, started daemon 139954252408576)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
# Thread: <_MainThread(MainThread, started 139955234240256)>
==================== Timed out after 60 seconds. ====================

  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apach# Thread: <Thread(wait_until_finish_read, started daemon 139954218575616)>

e_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
# Thread: <Thread(Thread-123, started daemon 139954227230464)>

# Thread: <Thread(Thread-118, started daemon 139954252408576)>

RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574374087.84_325e0d80-c5d2-4e16-bafc-232593073a28 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

# Thread: <_MainThread(MainThread, started 139955234240256)>

----------------------------------------------------------------------
Ran 38 tests in 357.980s

# Thread: <Thread(wait_until_finish_read, started daemon 139954235623168)>
FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 10m 18s
60 actionable tasks: 48 executed, 12 from cache

Publishing build scan...
https://gradle.com/s/vozixcuixid5o

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1597

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1597/display/redirect?page=changes>

Changes:

[github] [BEAM-8743] Add support for flat schemas in pubsub


------------------------------------------
[...truncated 1.33 MB...]
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 21:50:21 INFO sdk_worker.run: No more requests from control plane
19/11/21 21:50:21 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 21:50:21 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 21:50:21 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 21:50:21 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 21:50:21 INFO sdk_worker.run: Done consuming work.
19/11/21 21:50:21 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 21:50:21 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 21:50:21 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 21:50:21 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestq9y3RN/job_0194f54e-4303-4bc2-833d-8dd8e04445a0/MANIFEST
19/11/21 21:50:21 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestq9y3RN/job_0194f54e-4303-4bc2-833d-8dd8e04445a0/MANIFEST -> 0 artifacts
19/11/21 21:50:21 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 21:50:21 INFO sdk_worker_main.main: Logging handler created.
19/11/21 21:50:21 INFO sdk_worker_main.start: Status HTTP server running at localhost:43405
19/11/21 21:50:21 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 21:50:21 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 21:50:21 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574373019.06_b60e892f-1ada-4b2b-a992-73dc87e574a5', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 21:50:21 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574373019.06', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43349'}
19/11/21 21:50:21 INFO statecache.__init__: Creating state cache with size 0
19/11/21 21:50:21 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46273.
19/11/21 21:50:21 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/21 21:50:21 INFO sdk_worker.__init__: Control channel established.
19/11/21 21:50:21 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 21:50:21 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45501.
19/11/21 21:50:21 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 21:50:21 INFO data_plane.create_data_channel: Creating client data channel for localhost:34277
19/11/21 21:50:21 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 21:50:22 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 21:50:22 INFO sdk_worker.run: No more requests from control plane
19/11/21 21:50:22 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 21:50:22 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 21:50:22 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 21:50:22 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 21:50:22 INFO sdk_worker.run: Done consuming work.
19/11/21 21:50:22 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 21:50:22 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 21:50:22 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 21:50:22 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestq9y3RN/job_0194f54e-4303-4bc2-833d-8dd8e04445a0/MANIFEST
19/11/21 21:50:22 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestq9y3RN/job_0194f54e-4303-4bc2-833d-8dd8e04445a0/MANIFEST -> 0 artifacts
19/11/21 21:50:22 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 21:50:22 INFO sdk_worker_main.main: Logging handler created.
19/11/21 21:50:22 INFO sdk_worker_main.start: Status HTTP server running at localhost:43483
19/11/21 21:50:22 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 21:50:22 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 21:50:22 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574373019.06_b60e892f-1ada-4b2b-a992-73dc87e574a5', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 21:50:22 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574373019.06', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43349'}
19/11/21 21:50:22 INFO statecache.__init__: Creating state cache with size 0
19/11/21 21:50:22 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38929.
19/11/21 21:50:22 INFO sdk_worker.__init__: Control channel established.
19/11/21 21:50:22 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 21:50:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/21 21:50:22 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:32849.
19/11/21 21:50:22 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 21:50:22 INFO data_plane.create_data_channel: Creating client data channel for localhost:45279
19/11/21 21:50:22 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 21:50:23 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 21:50:23 INFO sdk_worker.run: No more requests from control plane
19/11/21 21:50:23 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 21:50:23 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 21:50:23 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 21:50:23 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 21:50:23 INFO sdk_worker.run: Done consuming work.
19/11/21 21:50:23 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 21:50:23 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 21:50:23 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 21:50:23 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestq9y3RN/job_0194f54e-4303-4bc2-833d-8dd8e04445a0/MANIFEST
19/11/21 21:50:23 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestq9y3RN/job_0194f54e-4303-4bc2-833d-8dd8e04445a0/MANIFEST -> 0 artifacts
19/11/21 21:50:24 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 21:50:24 INFO sdk_worker_main.main: Logging handler created.
19/11/21 21:50:24 INFO sdk_worker_main.start: Status HTTP server running at localhost:37747
19/11/21 21:50:24 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 21:50:24 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 21:50:24 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574373019.06_b60e892f-1ada-4b2b-a992-73dc87e574a5', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 21:50:24 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574373019.06', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43349'}
19/11/21 21:50:24 INFO statecache.__init__: Creating state cache with size 0
19/11/21 21:50:24 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41967.
19/11/21 21:50:24 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/21 21:50:24 INFO sdk_worker.__init__: Control channel established.
19/11/21 21:50:24 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 21:50:24 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42403.
19/11/21 21:50:24 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 21:50:24 INFO data_plane.create_data_channel: Creating client data channel for localhost:37895
19/11/21 21:50:24 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 21:50:24 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 21:50:24 INFO sdk_worker.run: No more requests from control plane
19/11/21 21:50:24 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 21:50:24 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 21:50:24 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 21:50:24 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 21:50:24 INFO sdk_worker.run: Done consuming work.
19/11/21 21:50:24 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 21:50:24 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 21:50:24 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 21:50:24 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestq9y3RN/job_0194f54e-4303-4bc2-833d-8dd8e04445a0/MANIFEST
19/11/21 21:50:24 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestq9y3RN/job_0194f54e-4303-4bc2-833d-8dd8e04445a0/MANIFEST -> 0 artifacts
19/11/21 21:50:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 21:50:25 INFO sdk_worker_main.main: Logging handler created.
19/11/21 21:50:25 INFO sdk_worker_main.start: Status HTTP server running at localhost:35827
19/11/21 21:50:25 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 21:50:25 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 21:50:25 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574373019.06_b60e892f-1ada-4b2b-a992-73dc87e574a5', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 21:50:25 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574373019.06', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43349'}
19/11/21 21:50:25 INFO statecache.__init__: Creating state cache with size 0
19/11/21 21:50:25 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33373.
19/11/21 21:50:25 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/21 21:50:25 INFO sdk_worker.__init__: Control channel established.
19/11/21 21:50:25 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 21:50:25 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35673.
19/11/21 21:50:25 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 21:50:25 INFO data_plane.create_data_channel: Creating client data channel for localhost:42425
19/11/21 21:50:25 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 21:50:25 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 21:50:25 INFO sdk_worker.run: No more requests from control plane
19/11/21 21:50:25 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 21:50:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 21:50:25 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 21:50:25 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 21:50:25 INFO sdk_worker.run: Done consuming work.
19/11/21 21:50:25 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 21:50:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 21:50:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 21:50:25 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574373019.06_b60e892f-1ada-4b2b-a992-73dc87e574a5 finished.
19/11/21 21:50:25 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/21 21:50:25 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestq9y3RN/job_0194f54e-4303-4bc2-833d-8dd8e04445a0/MANIFEST has 0 artifact locations
19/11/21 21:50:25 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestq9y3RN/job_0194f54e-4303-4bc2-833d-8dd8e04445a0/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
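The `handler` frame at the bottom of this traceback is the test suite's watchdog. A minimal sketch of such a watchdog follows, assuming SIGALRM-based wiring; the actual helper in portable_runner_test.py may differ in detail. It produces output shaped like the "# Thread: ..." dumps interleaved through the failures below.

    import signal
    import threading

    TIMEOUT_SECS = 60  # matches the "Timed out after 60 seconds." messages

    def handler(signum, frame):
        # Dump the live threads first, so hangs like the gRPC
        # state-stream wait above are diagnosable, then raise
        # BaseException rather than Exception so no test-level
        # except clause can swallow the timeout.
        msg = 'Timed out after %s seconds.' % TIMEOUT_SECS
        print('==================== %s ====================' % msg)
        for t in threading.enumerate():
            print('# Thread: %s' % t)
        raise BaseException(msg)

    signal.signal(signal.SIGALRM, handler)
    signal.alarm(TIMEOUT_SECS)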

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
==================== Timed out after 60 seconds. ====================

Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

# Thread: <Thread(wait_until_finish_read, started daemon 140542884382464)>
# Thread: <Thread(Thread-119, started daemon 140542901167872)>
# Thread: <_MainThread(MainThread, started 140543680399104)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140542394492672)>
# Thread: <Thread(Thread-124, started daemon 140542402885376)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574373007.92_342a8f14-d1fd-4309-8a6f-3fd7aae65ff2 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

# Thread: <_MainThread(MainThread, started 140543680399104)>
# Thread: <Thread(wait_until_finish_read, started daemon 140542884382464)>
# Thread: <Thread(Thread-119, started daemon 140542901167872)>
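All three errors in this run surface through the same pattern visible in the tracebacks above: assert_that inside a pipeline context manager, whose __exit__ calls run().wait_until_finish(). A stripped-down sketch of that flow (the pipeline contents here are illustrative, not the failing tests' actual bodies):

    import apache_beam as beam
    from apache_beam.testing.util import assert_that, equal_to

    with beam.Pipeline() as p:
        actual = p | beam.Create([1, 2, 3]) | beam.Map(lambda x: x * 10)
        # assert_that only attaches a verification transform here...
        assert_that(actual, equal_to([10, 20, 30]))
    # ...the real work happens when the context manager exits:
    # Pipeline.__exit__ calls self.run().wait_until_finish(), which is
    # the frame where these tests either time out waiting on the gRPC
    # state stream or observe the terminal FAILED state.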
----------------------------------------------------------------------
Ran 38 tests in 375.484s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
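To chase the underlying test failure rather than the generic exit-value-1 message, the failing task can be rerun with the diagnostic flags Gradle suggests above. A small helper along these lines (invocation sketch; assumes the current directory is a Beam source checkout containing ./gradlew):

    import subprocess

    # Re-run the failing task with a stack trace and verbose logging.
    subprocess.check_call([
        './gradlew',
        ':sdks:python:test-suites:portable:py2:sparkValidatesRunner',
        '--stacktrace', '--info',
    ])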

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 48s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/5drzjl2zoolbm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1596

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1596/display/redirect?page=changes>

Changes:

[github] [BEAM-8659] RowJsonTest should test serialization independently


------------------------------------------
[...truncated 1.33 MB...]
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 21:01:14 INFO sdk_worker.run: No more requests from control plane
19/11/21 21:01:14 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 21:01:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 21:01:14 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 21:01:14 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 21:01:14 INFO sdk_worker.run: Done consuming work.
19/11/21 21:01:14 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 21:01:14 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 21:01:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 21:01:14 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest5PrG8T/job_1b88e8c5-8224-4190-bda9-d90750874200/MANIFEST
19/11/21 21:01:14 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest5PrG8T/job_1b88e8c5-8224-4190-bda9-d90750874200/MANIFEST -> 0 artifacts
19/11/21 21:01:14 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 21:01:14 INFO sdk_worker_main.main: Logging handler created.
19/11/21 21:01:14 INFO sdk_worker_main.start: Status HTTP server running at localhost:38221
19/11/21 21:01:14 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 21:01:14 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 21:01:14 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574370072.21_5498ee9d-f3ef-43ca-9b71-6b193e4f7680', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 21:01:14 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574370072.21', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50417'}
19/11/21 21:01:14 INFO statecache.__init__: Creating state cache with size 0
19/11/21 21:01:14 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42533.
19/11/21 21:01:14 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/21 21:01:14 INFO sdk_worker.__init__: Control channel established.
19/11/21 21:01:14 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 21:01:14 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40661.
19/11/21 21:01:14 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 21:01:14 INFO data_plane.create_data_channel: Creating client data channel for localhost:36431
19/11/21 21:01:14 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 21:01:15 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 21:01:15 INFO sdk_worker.run: No more requests from control plane
19/11/21 21:01:15 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 21:01:15 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 21:01:15 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 21:01:15 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 21:01:15 INFO sdk_worker.run: Done consuming work.
19/11/21 21:01:15 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 21:01:15 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 21:01:15 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 21:01:15 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest5PrG8T/job_1b88e8c5-8224-4190-bda9-d90750874200/MANIFEST
19/11/21 21:01:15 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest5PrG8T/job_1b88e8c5-8224-4190-bda9-d90750874200/MANIFEST -> 0 artifacts
19/11/21 21:01:15 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 21:01:15 INFO sdk_worker_main.main: Logging handler created.
19/11/21 21:01:15 INFO sdk_worker_main.start: Status HTTP server running at localhost:42599
19/11/21 21:01:15 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 21:01:15 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 21:01:15 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574370072.21_5498ee9d-f3ef-43ca-9b71-6b193e4f7680', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 21:01:15 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574370072.21', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50417'}
19/11/21 21:01:15 INFO statecache.__init__: Creating state cache with size 0
19/11/21 21:01:15 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42595.
19/11/21 21:01:15 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/21 21:01:15 INFO sdk_worker.__init__: Control channel established.
19/11/21 21:01:15 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 21:01:15 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34949.
19/11/21 21:01:15 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 21:01:15 INFO data_plane.create_data_channel: Creating client data channel for localhost:39243
19/11/21 21:01:15 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 21:01:16 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 21:01:16 INFO sdk_worker.run: No more requests from control plane
19/11/21 21:01:16 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 21:01:16 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 21:01:16 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 21:01:16 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 21:01:16 INFO sdk_worker.run: Done consuming work.
19/11/21 21:01:16 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 21:01:16 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 21:01:16 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 21:01:16 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest5PrG8T/job_1b88e8c5-8224-4190-bda9-d90750874200/MANIFEST
19/11/21 21:01:16 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest5PrG8T/job_1b88e8c5-8224-4190-bda9-d90750874200/MANIFEST -> 0 artifacts
19/11/21 21:01:16 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 21:01:16 INFO sdk_worker_main.main: Logging handler created.
19/11/21 21:01:16 INFO sdk_worker_main.start: Status HTTP server running at localhost:42249
19/11/21 21:01:16 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 21:01:16 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 21:01:16 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574370072.21_5498ee9d-f3ef-43ca-9b71-6b193e4f7680', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 21:01:16 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574370072.21', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50417'}
19/11/21 21:01:16 INFO statecache.__init__: Creating state cache with size 0
19/11/21 21:01:16 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39089.
19/11/21 21:01:16 INFO sdk_worker.__init__: Control channel established.
19/11/21 21:01:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/21 21:01:16 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 21:01:16 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44613.
19/11/21 21:01:16 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 21:01:16 INFO data_plane.create_data_channel: Creating client data channel for localhost:42395
19/11/21 21:01:16 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 21:01:16 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 21:01:16 INFO sdk_worker.run: No more requests from control plane
19/11/21 21:01:16 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 21:01:16 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 21:01:16 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 21:01:16 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 21:01:16 INFO sdk_worker.run: Done consuming work.
19/11/21 21:01:16 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 21:01:16 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 21:01:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 21:01:17 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest5PrG8T/job_1b88e8c5-8224-4190-bda9-d90750874200/MANIFEST
19/11/21 21:01:17 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest5PrG8T/job_1b88e8c5-8224-4190-bda9-d90750874200/MANIFEST -> 0 artifacts
19/11/21 21:01:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 21:01:17 INFO sdk_worker_main.main: Logging handler created.
19/11/21 21:01:17 INFO sdk_worker_main.start: Status HTTP server running at localhost:35627
19/11/21 21:01:17 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 21:01:17 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 21:01:17 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574370072.21_5498ee9d-f3ef-43ca-9b71-6b193e4f7680', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 21:01:17 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574370072.21', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50417'}
19/11/21 21:01:17 INFO statecache.__init__: Creating state cache with size 0
19/11/21 21:01:17 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41441.
19/11/21 21:01:17 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/21 21:01:17 INFO sdk_worker.__init__: Control channel established.
19/11/21 21:01:17 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 21:01:17 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45855.
19/11/21 21:01:17 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 21:01:17 INFO data_plane.create_data_channel: Creating client data channel for localhost:39537
19/11/21 21:01:17 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 21:01:17 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 21:01:17 INFO sdk_worker.run: No more requests from control plane
19/11/21 21:01:17 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 21:01:17 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 21:01:17 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 21:01:17 INFO sdk_worker.run: Done consuming work.
19/11/21 21:01:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 21:01:17 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 21:01:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 21:01:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 21:01:17 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574370072.21_5498ee9d-f3ef-43ca-9b71-6b193e4f7680 finished.
19/11/21 21:01:18 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/21 21:01:18 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktest5PrG8T/job_1b88e8c5-8224-4190-bda9-d90750874200/MANIFEST has 0 artifact locations
19/11/21 21:01:18 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktest5PrG8T/job_1b88e8c5-8224-4190-bda9-d90750874200/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
==================== Timed out after 60 seconds. ====================

    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
# Thread: <Thread(wait_until_finish_read, started daemon 139928699950848)>

    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
# Thread: <Thread(Thread-119, started daemon 139928079951616)>

BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574370062.13_b2c6372f-fc08-43b7-ab00-c5e8bcb3d555 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

==================== Timed out after 60 seconds. ====================

# Thread: <_MainThread(MainThread, started 139929215342336)>
# Thread: <Thread(wait_until_finish_read, started daemon 139928063166208)>
# Thread: <Thread(Thread-125, started daemon 139928071558912)>

----------------------------------------------------------------------
Ran 38 tests in 314.196s

FAILED (errors=3, skipped=9)

# Thread: <_MainThread(MainThread, started 139929215342336)>
# Thread: <Thread(Thread-119, started daemon 139928079951616)>
# Thread: <Thread(wait_until_finish_read, started daemon 139928699950848)>

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 58s
60 actionable tasks: 48 executed, 12 from cache

Publishing build scan...
https://gradle.com/s/vqahay4vzsyvi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1595

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1595/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 18:20:48 INFO sdk_worker.run: No more requests from control plane
19/11/21 18:20:48 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 18:20:48 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 18:20:48 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 18:20:48 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 18:20:48 INFO sdk_worker.run: Done consuming work.
19/11/21 18:20:48 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 18:20:48 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 18:20:48 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 18:20:48 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestQoE8wg/job_77dd1269-41fc-412a-9396-0b687c8b4273/MANIFEST
19/11/21 18:20:48 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestQoE8wg/job_77dd1269-41fc-412a-9396-0b687c8b4273/MANIFEST -> 0 artifacts
19/11/21 18:20:49 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 18:20:49 INFO sdk_worker_main.main: Logging handler created.
19/11/21 18:20:49 INFO sdk_worker_main.start: Status HTTP server running at localhost:36191
19/11/21 18:20:49 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 18:20:49 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 18:20:49 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574360446.61_e8fa7a73-56f3-4217-9121-ef987d13eb58', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 18:20:49 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574360446.61', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37999'}
19/11/21 18:20:49 INFO statecache.__init__: Creating state cache with size 0
19/11/21 18:20:49 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41749.
19/11/21 18:20:49 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/21 18:20:49 INFO sdk_worker.__init__: Control channel established.
19/11/21 18:20:49 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 18:20:49 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42321.
19/11/21 18:20:49 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 18:20:49 INFO data_plane.create_data_channel: Creating client data channel for localhost:42563
19/11/21 18:20:49 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 18:20:49 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 18:20:49 INFO sdk_worker.run: No more requests from control plane
19/11/21 18:20:49 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 18:20:49 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 18:20:49 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 18:20:49 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 18:20:49 INFO sdk_worker.run: Done consuming work.
19/11/21 18:20:49 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 18:20:49 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 18:20:49 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 18:20:49 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestQoE8wg/job_77dd1269-41fc-412a-9396-0b687c8b4273/MANIFEST
19/11/21 18:20:49 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestQoE8wg/job_77dd1269-41fc-412a-9396-0b687c8b4273/MANIFEST -> 0 artifacts
19/11/21 18:20:50 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 18:20:50 INFO sdk_worker_main.main: Logging handler created.
19/11/21 18:20:50 INFO sdk_worker_main.start: Status HTTP server running at localhost:37823
19/11/21 18:20:50 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 18:20:50 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 18:20:50 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574360446.61_e8fa7a73-56f3-4217-9121-ef987d13eb58', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 18:20:50 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574360446.61', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37999'}
19/11/21 18:20:50 INFO statecache.__init__: Creating state cache with size 0
19/11/21 18:20:50 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41505.
19/11/21 18:20:50 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/21 18:20:50 INFO sdk_worker.__init__: Control channel established.
19/11/21 18:20:50 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 18:20:50 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39367.
19/11/21 18:20:50 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 18:20:50 INFO data_plane.create_data_channel: Creating client data channel for localhost:40693
19/11/21 18:20:50 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 18:20:50 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 18:20:50 INFO sdk_worker.run: No more requests from control plane
19/11/21 18:20:50 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 18:20:50 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 18:20:50 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 18:20:50 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 18:20:50 INFO sdk_worker.run: Done consuming work.
19/11/21 18:20:50 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 18:20:50 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 18:20:50 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 18:20:50 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestQoE8wg/job_77dd1269-41fc-412a-9396-0b687c8b4273/MANIFEST
19/11/21 18:20:50 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestQoE8wg/job_77dd1269-41fc-412a-9396-0b687c8b4273/MANIFEST -> 0 artifacts
19/11/21 18:20:51 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 18:20:51 INFO sdk_worker_main.main: Logging handler created.
19/11/21 18:20:51 INFO sdk_worker_main.start: Status HTTP server running at localhost:38493
19/11/21 18:20:51 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 18:20:51 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 18:20:51 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574360446.61_e8fa7a73-56f3-4217-9121-ef987d13eb58', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 18:20:51 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574360446.61', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37999'}
19/11/21 18:20:51 INFO statecache.__init__: Creating state cache with size 0
19/11/21 18:20:51 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38039.
19/11/21 18:20:51 INFO sdk_worker.__init__: Control channel established.
19/11/21 18:20:51 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/21 18:20:51 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 18:20:51 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:32893.
19/11/21 18:20:51 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 18:20:51 INFO data_plane.create_data_channel: Creating client data channel for localhost:45571
19/11/21 18:20:51 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 18:20:51 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 18:20:51 INFO sdk_worker.run: No more requests from control plane
19/11/21 18:20:51 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 18:20:51 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 18:20:51 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 18:20:51 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 18:20:51 INFO sdk_worker.run: Done consuming work.
19/11/21 18:20:51 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 18:20:51 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 18:20:51 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 18:20:51 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestQoE8wg/job_77dd1269-41fc-412a-9396-0b687c8b4273/MANIFEST
19/11/21 18:20:51 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestQoE8wg/job_77dd1269-41fc-412a-9396-0b687c8b4273/MANIFEST -> 0 artifacts
19/11/21 18:20:52 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 18:20:52 INFO sdk_worker_main.main: Logging handler created.
19/11/21 18:20:52 INFO sdk_worker_main.start: Status HTTP server running at localhost:33125
19/11/21 18:20:52 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 18:20:52 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 18:20:52 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574360446.61_e8fa7a73-56f3-4217-9121-ef987d13eb58', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 18:20:52 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574360446.61', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37999'}
19/11/21 18:20:52 INFO statecache.__init__: Creating state cache with size 0
19/11/21 18:20:52 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39947.
19/11/21 18:20:52 INFO sdk_worker.__init__: Control channel established.
19/11/21 18:20:52 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/21 18:20:52 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 18:20:52 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42843.
19/11/21 18:20:52 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 18:20:52 INFO data_plane.create_data_channel: Creating client data channel for localhost:42693
19/11/21 18:20:52 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 18:20:52 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 18:20:52 INFO sdk_worker.run: No more requests from control plane
19/11/21 18:20:52 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 18:20:52 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 18:20:52 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 18:20:52 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 18:20:52 INFO sdk_worker.run: Done consuming work.
19/11/21 18:20:52 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 18:20:52 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 18:20:52 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 18:20:52 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574360446.61_e8fa7a73-56f3-4217-9121-ef987d13eb58 finished.
19/11/21 18:20:52 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/21 18:20:52 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestQoE8wg/job_77dd1269-41fc-412a-9396-0b687c8b4273/MANIFEST has 0 artifact locations
19/11/21 18:20:52 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestQoE8wg/job_77dd1269-41fc-412a-9396-0b687c8b4273/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
==================== Timed out after 60 seconds. ====================

Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

# Thread: <Thread(wait_until_finish_read, started daemon 140316581558016)>
# Thread: <Thread(Thread-118, started daemon 140316927649536)>
# Thread: <_MainThread(MainThread, started 140317706880768)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140316564772608)>
# Thread: <Thread(Thread-124, started daemon 140316556379904)>
# Thread: <Thread(Thread-118, started daemon 140316927649536)>
# Thread: <Thread(wait_until_finish_read, started daemon 140316581558016)>
# Thread: <_MainThread(MainThread, started 140317706880768)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574360436.07_a8705a54-7677-49a0-a802-ff035ce9dee6 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 335.697s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 24s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/4n3r6vcwy5aio

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1594

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1594/display/redirect?page=changes>

Changes:

[bhulette] Fix inverted timeout check in TestPubsub

[bhulette] Add empty check before acknowledge


------------------------------------------
[...truncated 1.32 MB...]
19/11/21 17:18:45 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 17:18:45 INFO data_plane.create_data_channel: Creating client data channel for localhost:40915
19/11/21 17:18:45 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 17:18:45 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 17:18:45 INFO sdk_worker.run: No more requests from control plane
19/11/21 17:18:45 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 17:18:45 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 17:18:45 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 17:18:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 17:18:45 INFO sdk_worker.run: Done consuming work.
19/11/21 17:18:45 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 17:18:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 17:18:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 17:18:45 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestS4RsrF/job_bd826526-6af5-48f8-ac7a-edefdb3560ce/MANIFEST
19/11/21 17:18:45 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestS4RsrF/job_bd826526-6af5-48f8-ac7a-edefdb3560ce/MANIFEST -> 0 artifacts
19/11/21 17:18:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 17:18:46 INFO sdk_worker_main.main: Logging handler created.
19/11/21 17:18:46 INFO sdk_worker_main.start: Status HTTP server running at localhost:36065
19/11/21 17:18:46 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 17:18:46 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 17:18:46 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574356723.69_77af0195-db27-430e-9077-1dcbb6ae0f83', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 17:18:46 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574356723.69', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40919'}
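
The pipeline_options dump above is the key to how this suite executes: environment_type PROCESS tells the runner to launch each SDK harness by invoking the built sdk_worker.sh directly, and job_endpoint points the client at the local Spark job server. A minimal sketch of submitting a pipeline with equivalent options; the endpoint and worker-script path below are placeholders standing in for the workspace-specific values in the log:

    # Minimal sketch: submit a pipeline to a portable job server using a
    # PROCESS environment, mirroring the options echoed in the log above.
    # The endpoint and worker-script path are placeholders.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:40919',
        '--environment_type=PROCESS',
        '--environment_config={"command": "/path/to/build/sdk_worker.sh"}',
        '--experiments=beam_fn_api',
    ])

    with beam.Pipeline(options=options) as p:
        _ = p | 'Create' >> beam.Create([1, 2, 3])
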
19/11/21 17:18:46 INFO statecache.__init__: Creating state cache with size 0
19/11/21 17:18:46 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38111.
19/11/21 17:18:46 INFO sdk_worker.__init__: Control channel established.
19/11/21 17:18:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/21 17:18:46 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 17:18:46 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40847.
19/11/21 17:18:46 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 17:18:46 INFO data_plane.create_data_channel: Creating client data channel for localhost:39805
19/11/21 17:18:46 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 17:18:46 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 17:18:46 INFO sdk_worker.run: No more requests from control plane
19/11/21 17:18:46 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 17:18:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 17:18:46 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 17:18:46 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 17:18:46 INFO sdk_worker.run: Done consuming work.
19/11/21 17:18:46 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 17:18:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 17:18:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 17:18:46 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestS4RsrF/job_bd826526-6af5-48f8-ac7a-edefdb3560ce/MANIFEST
19/11/21 17:18:46 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestS4RsrF/job_bd826526-6af5-48f8-ac7a-edefdb3560ce/MANIFEST -> 0 artifacts
19/11/21 17:18:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 17:18:46 INFO sdk_worker_main.main: Logging handler created.
19/11/21 17:18:46 INFO sdk_worker_main.start: Status HTTP server running at localhost:46105
19/11/21 17:18:46 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 17:18:46 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 17:18:46 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574356723.69_77af0195-db27-430e-9077-1dcbb6ae0f83', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 17:18:46 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574356723.69', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40919'}
19/11/21 17:18:46 INFO statecache.__init__: Creating state cache with size 0
19/11/21 17:18:46 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38629.
19/11/21 17:18:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/21 17:18:46 INFO sdk_worker.__init__: Control channel established.
19/11/21 17:18:46 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 17:18:46 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37415.
19/11/21 17:18:46 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 17:18:46 INFO data_plane.create_data_channel: Creating client data channel for localhost:37657
19/11/21 17:18:46 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 17:18:47 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 17:18:47 INFO sdk_worker.run: No more requests from control plane
19/11/21 17:18:47 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 17:18:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 17:18:47 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 17:18:47 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 17:18:47 INFO sdk_worker.run: Done consuming work.
19/11/21 17:18:47 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 17:18:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 17:18:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 17:18:47 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestS4RsrF/job_bd826526-6af5-48f8-ac7a-edefdb3560ce/MANIFEST
19/11/21 17:18:47 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestS4RsrF/job_bd826526-6af5-48f8-ac7a-edefdb3560ce/MANIFEST -> 0 artifacts
19/11/21 17:18:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 17:18:47 INFO sdk_worker_main.main: Logging handler created.
19/11/21 17:18:47 INFO sdk_worker_main.start: Status HTTP server running at localhost:35467
19/11/21 17:18:47 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 17:18:47 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 17:18:47 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574356723.69_77af0195-db27-430e-9077-1dcbb6ae0f83', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 17:18:47 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574356723.69', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40919'}
19/11/21 17:18:47 INFO statecache.__init__: Creating state cache with size 0
19/11/21 17:18:47 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34583.
19/11/21 17:18:47 INFO sdk_worker.__init__: Control channel established.
19/11/21 17:18:47 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 17:18:47 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/21 17:18:47 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39653.
19/11/21 17:18:47 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 17:18:47 INFO data_plane.create_data_channel: Creating client data channel for localhost:45493
19/11/21 17:18:47 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 17:18:47 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 17:18:47 INFO sdk_worker.run: No more requests from control plane
19/11/21 17:18:47 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 17:18:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 17:18:47 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 17:18:47 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 17:18:47 INFO sdk_worker.run: Done consuming work.
19/11/21 17:18:47 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 17:18:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 17:18:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 17:18:47 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestS4RsrF/job_bd826526-6af5-48f8-ac7a-edefdb3560ce/MANIFEST
19/11/21 17:18:47 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestS4RsrF/job_bd826526-6af5-48f8-ac7a-edefdb3560ce/MANIFEST -> 0 artifacts
19/11/21 17:18:48 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 17:18:48 INFO sdk_worker_main.main: Logging handler created.
19/11/21 17:18:48 INFO sdk_worker_main.start: Status HTTP server running at localhost:43519
19/11/21 17:18:48 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 17:18:48 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 17:18:48 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574356723.69_77af0195-db27-430e-9077-1dcbb6ae0f83', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 17:18:48 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574356723.69', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40919'}
19/11/21 17:18:48 INFO statecache.__init__: Creating state cache with size 0
19/11/21 17:18:48 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39787.
19/11/21 17:18:48 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/21 17:18:48 INFO sdk_worker.__init__: Control channel established.
19/11/21 17:18:48 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 17:18:48 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33141.
19/11/21 17:18:48 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 17:18:48 INFO data_plane.create_data_channel: Creating client data channel for localhost:44665
19/11/21 17:18:48 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 17:18:48 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 17:18:48 INFO sdk_worker.run: No more requests from control plane
19/11/21 17:18:48 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 17:18:48 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 17:18:48 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 17:18:48 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 17:18:48 INFO sdk_worker.run: Done consuming work.
19/11/21 17:18:48 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 17:18:48 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 17:18:48 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 17:18:48 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574356723.69_77af0195-db27-430e-9077-1dcbb6ae0f83 finished.
19/11/21 17:18:48 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/21 17:18:48 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestS4RsrF/job_bd826526-6af5-48f8-ac7a-edefdb3560ce/MANIFEST has 0 artifact locations
19/11/21 17:18:48 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestS4RsrF/job_bd826526-6af5-48f8-ac7a-edefdb3560ce/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
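
The 60-second timeouts in this run come from a watchdog in portable_runner_test.py: the innermost frame shows a handler interrupting a threading wait and raising BaseException, and the '==== Timed out ====' banners and '# Thread: ...' lines scattered through this log are that handler dumping every live thread before aborting the test. A rough sketch of such a watchdog; the SIGALRM mechanism is assumed, and only the message format, the thread-dump prefix, and the use of BaseException are taken from the log:

    # Rough sketch (mechanism assumed) of a per-test timeout watchdog: on
    # timeout, print a banner, dump every live thread as '# Thread: <...>',
    # then raise BaseException so broad 'except Exception' blocks cannot
    # swallow the abort.
    import signal
    import sys
    import threading
    import traceback

    def install_watchdog(timeout_secs=60):
        def handler(signum, frame):
            msg = 'Timed out after %s seconds.' % timeout_secs
            print('=' * 20 + ' ' + msg + ' ' + '=' * 20)
            threads_by_id = dict((t.ident, t) for t in threading.enumerate())
            for thread_id, stack in sys._current_frames().items():
                print('# Thread: %s' % threads_by_id.get(thread_id))
                traceback.print_stack(stack)
            raise BaseException(msg)
        signal.signal(signal.SIGALRM, handler)
        signal.alarm(timeout_secs)

Because the handler fires asynchronously while the main thread is blocked inside grpc's wait loop, its output interleaves with whatever traceback is being printed, which is why thread dumps appear mid-trace in the raw console output.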

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139821385250560)>
# Thread: <Thread(Thread-118, started daemon 139821467653888)>
# Thread: <_MainThread(MainThread, started 139822246885120)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139821368465152)>
# Thread: <Thread(Thread-124, started daemon 139821360072448)>
# Thread: <_MainThread(MainThread, started 139822246885120)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574356714.93_a60d8690-8832-42a7-a591-0c1bee2ca0c3 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
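
Note that assert_that and equal_to (from apache_beam.testing.util) do not compare values at the call site: they attach a verification transform to the pipeline, so a mismatch, or a runner error as here, only surfaces when the pipeline runs inside wait_until_finish(). A self-contained example of the same matcher style on the default direct runner, with no job server required:

    # The matcher style used by fn_api_runner_test: assert_that adds a
    # verification step to the pipeline itself, checked at run time.
    import apache_beam as beam
    from apache_beam.testing.util import assert_that, equal_to

    with beam.Pipeline() as p:
        actual = p | beam.Create(['a', 'b', 'c'])
        # Fails the pipeline at execution time if the contents differ.
        assert_that(actual, equal_to(['a', 'b', 'c']))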

----------------------------------------------------------------------
Ran 38 tests in 297.043s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 32s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/dwoqpv33j5p64

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1593

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1593/display/redirect?page=changes>

Changes:

[echauchot] [BEAM-8470] Enable custom window tests in new spark runner validates


------------------------------------------
[...truncated 1.32 MB...]
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 13:54:03 INFO sdk_worker.run: No more requests from control plane
19/11/21 13:54:03 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 13:54:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 13:54:03 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 13:54:03 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 13:54:03 INFO sdk_worker.run: Done consuming work.
19/11/21 13:54:03 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 13:54:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 13:54:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 13:54:04 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestwPWHQr/job_b69b615a-7319-436c-a800-d938d906d49c/MANIFEST
19/11/21 13:54:04 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestwPWHQr/job_b69b615a-7319-436c-a800-d938d906d49c/MANIFEST -> 0 artifacts
19/11/21 13:54:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 13:54:04 INFO sdk_worker_main.main: Logging handler created.
19/11/21 13:54:04 INFO sdk_worker_main.start: Status HTTP server running at localhost:35335
19/11/21 13:54:04 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 13:54:04 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 13:54:04 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574344442.16_3f2086f8-aea9-4b2a-80d2-a11b6ee7a392', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 13:54:04 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574344442.16', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42349'}
19/11/21 13:54:04 INFO statecache.__init__: Creating state cache with size 0
19/11/21 13:54:04 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37045.
19/11/21 13:54:04 INFO sdk_worker.__init__: Control channel established.
19/11/21 13:54:04 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 13:54:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/21 13:54:04 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37207.
19/11/21 13:54:04 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 13:54:04 INFO data_plane.create_data_channel: Creating client data channel for localhost:45979
19/11/21 13:54:04 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 13:54:04 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 13:54:04 INFO sdk_worker.run: No more requests from control plane
19/11/21 13:54:04 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 13:54:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 13:54:04 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 13:54:04 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 13:54:04 INFO sdk_worker.run: Done consuming work.
19/11/21 13:54:04 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 13:54:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 13:54:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 13:54:04 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestwPWHQr/job_b69b615a-7319-436c-a800-d938d906d49c/MANIFEST
19/11/21 13:54:04 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestwPWHQr/job_b69b615a-7319-436c-a800-d938d906d49c/MANIFEST -> 0 artifacts
19/11/21 13:54:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 13:54:05 INFO sdk_worker_main.main: Logging handler created.
19/11/21 13:54:05 INFO sdk_worker_main.start: Status HTTP server running at localhost:40909
19/11/21 13:54:05 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 13:54:05 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 13:54:05 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574344442.16_3f2086f8-aea9-4b2a-80d2-a11b6ee7a392', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 13:54:05 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574344442.16', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42349'}
19/11/21 13:54:05 INFO statecache.__init__: Creating state cache with size 0
19/11/21 13:54:05 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42243.
19/11/21 13:54:05 INFO sdk_worker.__init__: Control channel established.
19/11/21 13:54:05 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 13:54:05 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/21 13:54:05 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39659.
19/11/21 13:54:05 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 13:54:05 INFO data_plane.create_data_channel: Creating client data channel for localhost:45351
19/11/21 13:54:05 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 13:54:05 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 13:54:05 INFO sdk_worker.run: No more requests from control plane
19/11/21 13:54:05 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 13:54:05 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 13:54:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 13:54:05 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 13:54:05 INFO sdk_worker.run: Done consuming work.
19/11/21 13:54:05 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 13:54:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 13:54:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 13:54:05 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestwPWHQr/job_b69b615a-7319-436c-a800-d938d906d49c/MANIFEST
19/11/21 13:54:05 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestwPWHQr/job_b69b615a-7319-436c-a800-d938d906d49c/MANIFEST -> 0 artifacts
19/11/21 13:54:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 13:54:06 INFO sdk_worker_main.main: Logging handler created.
19/11/21 13:54:06 INFO sdk_worker_main.start: Status HTTP server running at localhost:33133
19/11/21 13:54:06 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 13:54:06 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 13:54:06 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574344442.16_3f2086f8-aea9-4b2a-80d2-a11b6ee7a392', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 13:54:06 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574344442.16', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42349'}
19/11/21 13:54:06 INFO statecache.__init__: Creating state cache with size 0
19/11/21 13:54:06 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45675.
19/11/21 13:54:06 INFO sdk_worker.__init__: Control channel established.
19/11/21 13:54:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/21 13:54:06 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 13:54:06 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44551.
19/11/21 13:54:06 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 13:54:06 INFO data_plane.create_data_channel: Creating client data channel for localhost:43849
19/11/21 13:54:06 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 13:54:06 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 13:54:06 INFO sdk_worker.run: No more requests from control plane
19/11/21 13:54:06 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 13:54:06 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 13:54:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 13:54:06 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 13:54:06 INFO sdk_worker.run: Done consuming work.
19/11/21 13:54:06 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 13:54:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 13:54:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 13:54:06 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestwPWHQr/job_b69b615a-7319-436c-a800-d938d906d49c/MANIFEST
19/11/21 13:54:06 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestwPWHQr/job_b69b615a-7319-436c-a800-d938d906d49c/MANIFEST -> 0 artifacts
19/11/21 13:54:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 13:54:07 INFO sdk_worker_main.main: Logging handler created.
19/11/21 13:54:07 INFO sdk_worker_main.start: Status HTTP server running at localhost:39223
19/11/21 13:54:07 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 13:54:07 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 13:54:07 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574344442.16_3f2086f8-aea9-4b2a-80d2-a11b6ee7a392', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 13:54:07 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574344442.16', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42349'}
19/11/21 13:54:07 INFO statecache.__init__: Creating state cache with size 0
19/11/21 13:54:07 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46513.
19/11/21 13:54:07 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/21 13:54:07 INFO sdk_worker.__init__: Control channel established.
19/11/21 13:54:07 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 13:54:07 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44773.
19/11/21 13:54:07 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 13:54:07 INFO data_plane.create_data_channel: Creating client data channel for localhost:36837
19/11/21 13:54:07 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 13:54:07 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 13:54:07 INFO sdk_worker.run: No more requests from control plane
19/11/21 13:54:07 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 13:54:07 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 13:54:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 13:54:07 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 13:54:07 INFO sdk_worker.run: Done consuming work.
19/11/21 13:54:07 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 13:54:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 13:54:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 13:54:07 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574344442.16_3f2086f8-aea9-4b2a-80d2-a11b6ee7a392 finished.
19/11/21 13:54:07 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/21 13:54:07 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestwPWHQr/job_b69b615a-7319-436c-a800-d938d906d49c/MANIFEST has 0 artifact locations
19/11/21 13:54:07 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestwPWHQr/job_b69b615a-7319-436c-a800-d938d906d49c/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140274420406016)>
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once

# Thread: <Thread(Thread-120, started daemon 140274428798720)>

    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

# Thread: <_MainThread(MainThread, started 140275414267648)>
==================== Timed out after 60 seconds. ====================

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
# Thread: <Thread(wait_until_finish_read, started daemon 140274395227904)>

  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py",# Thread: <Thread(Thread-125, started daemon 140274403620608)>

 line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
# Thread: <_MainThread(MainThread, started 140275414267648)>

RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574344433.15_b2b5998e-4c0c-4422-b5a6-a4cae286e729 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

# Thread: <Thread(Thread-120, started daemon 140274428798720)>

----------------------------------------------------------------------
Ran 38 tests in 302.527s

# Thread: <Thread(wait_until_finish_read, started daemon 140274420406016)>
FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 56s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/5trjlxcolgq36

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1592

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1592/display/redirect>

Changes:


------------------------------------------
[...truncated 1.33 MB...]
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 12:14:45 INFO sdk_worker.run: No more requests from control plane
19/11/21 12:14:45 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 12:14:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 12:14:45 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 12:14:45 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 12:14:45 INFO sdk_worker.run: Done consuming work.
19/11/21 12:14:45 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 12:14:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 12:14:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 12:14:45 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest9FCBxi/job_ac60db70-cbcc-499b-97de-f1b4c1f0f089/MANIFEST
19/11/21 12:14:45 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest9FCBxi/job_ac60db70-cbcc-499b-97de-f1b4c1f0f089/MANIFEST -> 0 artifacts
19/11/21 12:14:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 12:14:46 INFO sdk_worker_main.main: Logging handler created.
19/11/21 12:14:46 INFO sdk_worker_main.start: Status HTTP server running at localhost:44341
19/11/21 12:14:46 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 12:14:46 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 12:14:46 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574338483.9_13d21a99-c3ab-4faf-bada-625e60c588f7', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 12:14:46 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574338483.9', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:59249'}
19/11/21 12:14:46 INFO statecache.__init__: Creating state cache with size 0
19/11/21 12:14:46 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36777.
19/11/21 12:14:46 INFO sdk_worker.__init__: Control channel established.
19/11/21 12:14:46 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 12:14:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/21 12:14:46 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41757.
19/11/21 12:14:46 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 12:14:46 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 12:14:46 INFO data_plane.create_data_channel: Creating client data channel for localhost:36829
19/11/21 12:14:46 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 12:14:46 INFO sdk_worker.run: No more requests from control plane
19/11/21 12:14:46 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 12:14:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 12:14:46 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 12:14:46 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 12:14:46 INFO sdk_worker.run: Done consuming work.
19/11/21 12:14:46 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 12:14:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 12:14:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 12:14:46 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest9FCBxi/job_ac60db70-cbcc-499b-97de-f1b4c1f0f089/MANIFEST
19/11/21 12:14:46 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest9FCBxi/job_ac60db70-cbcc-499b-97de-f1b4c1f0f089/MANIFEST -> 0 artifacts
19/11/21 12:14:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 12:14:47 INFO sdk_worker_main.main: Logging handler created.
19/11/21 12:14:47 INFO sdk_worker_main.start: Status HTTP server running at localhost:35471
19/11/21 12:14:47 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 12:14:47 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 12:14:47 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574338483.9_13d21a99-c3ab-4faf-bada-625e60c588f7', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 12:14:47 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574338483.9', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:59249'}
19/11/21 12:14:47 INFO statecache.__init__: Creating state cache with size 0
19/11/21 12:14:47 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39187.
19/11/21 12:14:47 INFO sdk_worker.__init__: Control channel established.
19/11/21 12:14:47 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 12:14:47 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/21 12:14:47 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38011.
19/11/21 12:14:47 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 12:14:47 INFO data_plane.create_data_channel: Creating client data channel for localhost:43475
19/11/21 12:14:47 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 12:14:47 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 12:14:47 INFO sdk_worker.run: No more requests from control plane
19/11/21 12:14:47 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 12:14:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 12:14:47 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 12:14:47 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 12:14:47 INFO sdk_worker.run: Done consuming work.
19/11/21 12:14:47 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 12:14:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 12:14:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 12:14:47 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest9FCBxi/job_ac60db70-cbcc-499b-97de-f1b4c1f0f089/MANIFEST
19/11/21 12:14:47 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest9FCBxi/job_ac60db70-cbcc-499b-97de-f1b4c1f0f089/MANIFEST -> 0 artifacts
19/11/21 12:14:48 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 12:14:48 INFO sdk_worker_main.main: Logging handler created.
19/11/21 12:14:48 INFO sdk_worker_main.start: Status HTTP server running at localhost:34753
19/11/21 12:14:48 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 12:14:48 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 12:14:48 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574338483.9_13d21a99-c3ab-4faf-bada-625e60c588f7', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 12:14:48 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574338483.9', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:59249'}
19/11/21 12:14:48 INFO statecache.__init__: Creating state cache with size 0
19/11/21 12:14:48 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33959.
19/11/21 12:14:48 INFO sdk_worker.__init__: Control channel established.
19/11/21 12:14:48 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 12:14:48 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/21 12:14:48 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35395.
19/11/21 12:14:48 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 12:14:48 INFO data_plane.create_data_channel: Creating client data channel for localhost:36923
19/11/21 12:14:48 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 12:14:48 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 12:14:48 INFO sdk_worker.run: No more requests from control plane
19/11/21 12:14:48 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 12:14:48 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 12:14:48 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 12:14:48 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 12:14:48 INFO sdk_worker.run: Done consuming work.
19/11/21 12:14:48 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 12:14:48 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 12:14:48 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 12:14:48 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest9FCBxi/job_ac60db70-cbcc-499b-97de-f1b4c1f0f089/MANIFEST
19/11/21 12:14:48 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest9FCBxi/job_ac60db70-cbcc-499b-97de-f1b4c1f0f089/MANIFEST -> 0 artifacts
19/11/21 12:14:49 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 12:14:49 INFO sdk_worker_main.main: Logging handler created.
19/11/21 12:14:49 INFO sdk_worker_main.start: Status HTTP server running at localhost:46333
19/11/21 12:14:49 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 12:14:49 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 12:14:49 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574338483.9_13d21a99-c3ab-4faf-bada-625e60c588f7', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 12:14:49 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574338483.9', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:59249'}
19/11/21 12:14:49 INFO statecache.__init__: Creating state cache with size 0
19/11/21 12:14:49 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36999.
19/11/21 12:14:49 INFO sdk_worker.__init__: Control channel established.
19/11/21 12:14:49 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/21 12:14:49 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 12:14:49 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37647.
19/11/21 12:14:49 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 12:14:49 INFO data_plane.create_data_channel: Creating client data channel for localhost:34207
19/11/21 12:14:49 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 12:14:49 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 12:14:49 INFO sdk_worker.run: No more requests from control plane
19/11/21 12:14:49 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 12:14:49 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 12:14:49 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 12:14:49 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 12:14:49 INFO sdk_worker.run: Done consuming work.
19/11/21 12:14:49 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 12:14:49 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 12:14:49 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 12:14:49 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574338483.9_13d21a99-c3ab-4faf-bada-625e60c588f7 finished.
19/11/21 12:14:49 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/21 12:14:49 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktest9FCBxi/job_ac60db70-cbcc-499b-97de-f1b4c1f0f089/MANIFEST has 0 artifact locations
19/11/21 12:14:49 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktest9FCBxi/job_ac60db70-cbcc-499b-97de-f1b4c1f0f089/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
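
The handler frame at the bottom of this traceback is the test suite's own watchdog (portable_runner_test.py line 75): when a pipeline hangs, it prints the "==================== Timed out ... ====================" banner and the "# Thread:" dump seen below, then raises BaseException so that even a blocked wait_until_finish() unwinds. A minimal sketch of that pattern, assuming a SIGALRM-based alarm (illustrative names, not the verbatim Beam source):

    import signal
    import threading

    TIMEOUT_SECS = 60  # matches "Timed out after 60 seconds" above

    def handler(signum, frame):
        msg = 'Timed out after %d seconds.' % TIMEOUT_SECS
        print('=' * 20 + ' ' + msg + ' ' + '=' * 20)
        for t in threading.enumerate():
            print('# Thread: %s' % t)  # source of the "# Thread:" lines
        # BaseException rather than Exception so it escapes any broad
        # "except Exception" inside the runner and aborts the blocked wait.
        raise BaseException(msg)

    signal.signal(signal.SIGALRM, handler)
    signal.alarm(TIMEOUT_SECS)  # assumed to be re-armed around each test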

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py">, line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py">, line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py">, line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py">, line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139827549284096)>
# Thread: <Thread(Thread-116, started daemon 139827540891392)>
# Thread: <_MainThread(MainThread, started 139828328515328)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139827514402560)>
# Thread: <Thread(Thread-122, started daemon 139827523057408)>
# Thread: <_MainThread(MainThread, started 139828328515328)>
# Thread: <Thread(Thread-116, started daemon 139827540891392)>
# Thread: <Thread(wait_until_finish_read, started daemon 139827549284096)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574338474.7_7d8ac0e1-13ef-40b2-b589-3f967d67586c failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
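
Unlike the two timeouts above, this failure is reported synchronously: the job reaches the terminal FAILED state and wait_until_finish() turns the last error from the job service into a RuntimeError. The cause named in the message is that the Spark portable runner had not registered a bundle checkpoint handler, which test_sdf_with_watermark_tracking exercises because a splittable DoFn may checkpoint and hand back a residual restriction. A rough reconstruction of the final check, inferred from the frame at portable_runner.py line 438 and the message format above (not the verbatim source):

    # Sketch (inferred): tail of PipelineResult.wait_until_finish()
    if self._state != PipelineState.DONE:  # PipelineState assumed in scope
        raise RuntimeError(
            'Pipeline %s failed in state %s: %s' % (
                self._job_id, self._state, self._last_error_message()))
    return self._state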

----------------------------------------------------------------------
Ran 38 tests in 322.775s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
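
For this job the failing unit is the Gradle task named above; a plausible local reproduction, assuming a Beam source checkout with the bundled Gradle wrapper, would be:

    ./gradlew :sdks:python:test-suites:portable:py2:sparkValidatesRunner --stacktrace --info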

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 7s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/5ksddri6sw3xi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1591

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1591/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 06:15:51 INFO sdk_worker.run: No more requests from control plane
19/11/21 06:15:51 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 06:15:51 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 06:15:51 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 06:15:51 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 06:15:51 INFO sdk_worker.run: Done consuming work.
19/11/21 06:15:51 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 06:15:51 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 06:15:52 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 06:15:52 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestaW4BgP/job_d4ba5733-1dd8-4da9-bb5f-050a6b8f0c94/MANIFEST
19/11/21 06:15:52 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestaW4BgP/job_d4ba5733-1dd8-4da9-bb5f-050a6b8f0c94/MANIFEST -> 0 artifacts
19/11/21 06:15:52 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 06:15:52 INFO sdk_worker_main.main: Logging handler created.
19/11/21 06:15:52 INFO sdk_worker_main.start: Status HTTP server running at localhost:37153
19/11/21 06:15:52 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 06:15:52 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 06:15:52 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574316950.21_9c696fcc-7a6a-4730-adaa-8d03700aeb0a', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 06:15:52 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574316950.21', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:39629'}
19/11/21 06:15:52 INFO statecache.__init__: Creating state cache with size 0
19/11/21 06:15:52 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33779.
19/11/21 06:15:52 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/21 06:15:52 INFO sdk_worker.__init__: Control channel established.
19/11/21 06:15:52 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 06:15:52 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43539.
19/11/21 06:15:52 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 06:15:52 INFO data_plane.create_data_channel: Creating client data channel for localhost:40011
19/11/21 06:15:52 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 06:15:52 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 06:15:52 INFO sdk_worker.run: No more requests from control plane
19/11/21 06:15:52 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 06:15:52 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 06:15:52 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 06:15:52 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 06:15:52 INFO sdk_worker.run: Done consuming work.
19/11/21 06:15:52 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 06:15:52 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 06:15:52 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 06:15:53 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestaW4BgP/job_d4ba5733-1dd8-4da9-bb5f-050a6b8f0c94/MANIFEST
19/11/21 06:15:53 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestaW4BgP/job_d4ba5733-1dd8-4da9-bb5f-050a6b8f0c94/MANIFEST -> 0 artifacts
19/11/21 06:15:53 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 06:15:53 INFO sdk_worker_main.main: Logging handler created.
19/11/21 06:15:53 INFO sdk_worker_main.start: Status HTTP server running at localhost:45685
19/11/21 06:15:53 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 06:15:53 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 06:15:53 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574316950.21_9c696fcc-7a6a-4730-adaa-8d03700aeb0a', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 06:15:53 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574316950.21', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:39629'}
19/11/21 06:15:53 INFO statecache.__init__: Creating state cache with size 0
19/11/21 06:15:53 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45779.
19/11/21 06:15:53 INFO sdk_worker.__init__: Control channel established.
19/11/21 06:15:53 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 06:15:53 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/21 06:15:53 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36581.
19/11/21 06:15:53 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 06:15:53 INFO data_plane.create_data_channel: Creating client data channel for localhost:38803
19/11/21 06:15:53 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 06:15:53 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 06:15:53 INFO sdk_worker.run: No more requests from control plane
19/11/21 06:15:53 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 06:15:53 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 06:15:53 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 06:15:53 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 06:15:53 INFO sdk_worker.run: Done consuming work.
19/11/21 06:15:53 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 06:15:53 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 06:15:53 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 06:15:53 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestaW4BgP/job_d4ba5733-1dd8-4da9-bb5f-050a6b8f0c94/MANIFEST
19/11/21 06:15:53 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestaW4BgP/job_d4ba5733-1dd8-4da9-bb5f-050a6b8f0c94/MANIFEST -> 0 artifacts
19/11/21 06:15:54 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 06:15:54 INFO sdk_worker_main.main: Logging handler created.
19/11/21 06:15:54 INFO sdk_worker_main.start: Status HTTP server running at localhost:46761
19/11/21 06:15:54 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 06:15:54 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 06:15:54 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574316950.21_9c696fcc-7a6a-4730-adaa-8d03700aeb0a', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 06:15:54 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574316950.21', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:39629'}
19/11/21 06:15:54 INFO statecache.__init__: Creating state cache with size 0
19/11/21 06:15:54 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36803.
19/11/21 06:15:54 INFO sdk_worker.__init__: Control channel established.
19/11/21 06:15:54 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/21 06:15:54 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 06:15:54 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42087.
19/11/21 06:15:54 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 06:15:54 INFO data_plane.create_data_channel: Creating client data channel for localhost:43155
19/11/21 06:15:54 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 06:15:54 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 06:15:54 INFO sdk_worker.run: No more requests from control plane
19/11/21 06:15:54 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 06:15:54 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 06:15:54 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 06:15:54 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 06:15:54 INFO sdk_worker.run: Done consuming work.
19/11/21 06:15:54 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 06:15:54 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 06:15:54 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 06:15:54 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestaW4BgP/job_d4ba5733-1dd8-4da9-bb5f-050a6b8f0c94/MANIFEST
19/11/21 06:15:54 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestaW4BgP/job_d4ba5733-1dd8-4da9-bb5f-050a6b8f0c94/MANIFEST -> 0 artifacts
19/11/21 06:15:55 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 06:15:55 INFO sdk_worker_main.main: Logging handler created.
19/11/21 06:15:55 INFO sdk_worker_main.start: Status HTTP server running at localhost:43469
19/11/21 06:15:55 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 06:15:55 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 06:15:55 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574316950.21_9c696fcc-7a6a-4730-adaa-8d03700aeb0a', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 06:15:55 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574316950.21', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:39629'}
19/11/21 06:15:55 INFO statecache.__init__: Creating state cache with size 0
19/11/21 06:15:55 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40307.
19/11/21 06:15:55 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/21 06:15:55 INFO sdk_worker.__init__: Control channel established.
19/11/21 06:15:55 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 06:15:55 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43197.
19/11/21 06:15:55 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 06:15:55 INFO data_plane.create_data_channel: Creating client data channel for localhost:46561
19/11/21 06:15:55 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 06:15:55 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 06:15:55 INFO sdk_worker.run: No more requests from control plane
19/11/21 06:15:55 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 06:15:55 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 06:15:55 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 06:15:55 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 06:15:55 INFO sdk_worker.run: Done consuming work.
19/11/21 06:15:55 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 06:15:55 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 06:15:55 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 06:15:55 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574316950.21_9c696fcc-7a6a-4730-adaa-8d03700aeb0a finished.
19/11/21 06:15:55 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/21 06:15:55 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestaW4BgP/job_d4ba5733-1dd8-4da9-bb5f-050a6b8f0c94/MANIFEST has 0 artifact locations
19/11/21 06:15:55 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestaW4BgP/job_d4ba5733-1dd8-4da9-bb5f-050a6b8f0c94/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py">, line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py">, line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py">, line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py">, line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140133969946368)>
# Thread: <Thread(Thread-120, started daemon 140133953160960)>
# Thread: <_MainThread(MainThread, started 140134956226304)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140133936375552)>
# Thread: <Thread(Thread-126, started daemon 140133944768256)>
# Thread: <Thread(Thread-120, started daemon 140133953160960)>
# Thread: <Thread(wait_until_finish_read, started daemon 140133969946368)>
# Thread: <_MainThread(MainThread, started 140134956226304)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574316940.86_b7b83b6b-ae12-49c1-9e12-97c9d4fc1f36 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 303.888s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 34s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/qnpsvqk473d4s

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1590

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1590/display/redirect?page=changes>

Changes:

[github] common --> unique


------------------------------------------
[...truncated 1.32 MB...]
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 05:08:32 INFO sdk_worker.run: No more requests from control plane
19/11/21 05:08:32 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 05:08:32 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 05:08:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 05:08:32 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 05:08:32 INFO sdk_worker.run: Done consuming work.
19/11/21 05:08:32 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 05:08:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 05:08:33 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 05:08:33 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestfByBKc/job_69d59b5c-2b5c-4c51-a3ef-6c1c510edf53/MANIFEST
19/11/21 05:08:33 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestfByBKc/job_69d59b5c-2b5c-4c51-a3ef-6c1c510edf53/MANIFEST -> 0 artifacts
19/11/21 05:08:33 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 05:08:33 INFO sdk_worker_main.main: Logging handler created.
19/11/21 05:08:33 INFO sdk_worker_main.start: Status HTTP server running at localhost:38691
19/11/21 05:08:33 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 05:08:33 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 05:08:33 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574312911.26_f8e2e01e-2ba7-457f-ad64-83291aab3f64', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 05:08:33 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574312911.26', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56535'}
19/11/21 05:08:33 INFO statecache.__init__: Creating state cache with size 0
19/11/21 05:08:33 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34349.
19/11/21 05:08:33 INFO sdk_worker.__init__: Control channel established.
19/11/21 05:08:33 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 05:08:33 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/21 05:08:33 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38501.
19/11/21 05:08:33 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 05:08:33 INFO data_plane.create_data_channel: Creating client data channel for localhost:45887
19/11/21 05:08:33 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 05:08:33 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 05:08:33 INFO sdk_worker.run: No more requests from control plane
19/11/21 05:08:33 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 05:08:33 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 05:08:33 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 05:08:33 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 05:08:33 INFO sdk_worker.run: Done consuming work.
19/11/21 05:08:33 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 05:08:33 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 05:08:33 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 05:08:33 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestfByBKc/job_69d59b5c-2b5c-4c51-a3ef-6c1c510edf53/MANIFEST
19/11/21 05:08:33 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestfByBKc/job_69d59b5c-2b5c-4c51-a3ef-6c1c510edf53/MANIFEST -> 0 artifacts
19/11/21 05:08:34 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 05:08:34 INFO sdk_worker_main.main: Logging handler created.
19/11/21 05:08:34 INFO sdk_worker_main.start: Status HTTP server running at localhost:45013
19/11/21 05:08:34 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 05:08:34 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 05:08:34 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574312911.26_f8e2e01e-2ba7-457f-ad64-83291aab3f64', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 05:08:34 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574312911.26', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56535'}
19/11/21 05:08:34 INFO statecache.__init__: Creating state cache with size 0
19/11/21 05:08:34 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36615.
19/11/21 05:08:34 INFO sdk_worker.__init__: Control channel established.
19/11/21 05:08:34 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/21 05:08:34 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 05:08:34 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35599.
19/11/21 05:08:34 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 05:08:34 INFO data_plane.create_data_channel: Creating client data channel for localhost:41477
19/11/21 05:08:34 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 05:08:34 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 05:08:34 INFO sdk_worker.run: No more requests from control plane
19/11/21 05:08:34 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 05:08:34 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 05:08:34 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 05:08:34 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 05:08:34 INFO sdk_worker.run: Done consuming work.
19/11/21 05:08:34 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 05:08:34 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 05:08:34 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 05:08:34 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestfByBKc/job_69d59b5c-2b5c-4c51-a3ef-6c1c510edf53/MANIFEST
19/11/21 05:08:34 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestfByBKc/job_69d59b5c-2b5c-4c51-a3ef-6c1c510edf53/MANIFEST -> 0 artifacts
19/11/21 05:08:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 05:08:35 INFO sdk_worker_main.main: Logging handler created.
19/11/21 05:08:35 INFO sdk_worker_main.start: Status HTTP server running at localhost:40777
19/11/21 05:08:35 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 05:08:35 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 05:08:35 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574312911.26_f8e2e01e-2ba7-457f-ad64-83291aab3f64', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 05:08:35 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574312911.26', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56535'}
19/11/21 05:08:35 INFO statecache.__init__: Creating state cache with size 0
19/11/21 05:08:35 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37557.
19/11/21 05:08:35 INFO sdk_worker.__init__: Control channel established.
19/11/21 05:08:35 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/21 05:08:35 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 05:08:35 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38763.
19/11/21 05:08:35 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 05:08:35 INFO data_plane.create_data_channel: Creating client data channel for localhost:33531
19/11/21 05:08:35 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 05:08:35 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 05:08:35 INFO sdk_worker.run: No more requests from control plane
19/11/21 05:08:35 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 05:08:35 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 05:08:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 05:08:35 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 05:08:35 INFO sdk_worker.run: Done consuming work.
19/11/21 05:08:35 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 05:08:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 05:08:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 05:08:35 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestfByBKc/job_69d59b5c-2b5c-4c51-a3ef-6c1c510edf53/MANIFEST
19/11/21 05:08:35 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestfByBKc/job_69d59b5c-2b5c-4c51-a3ef-6c1c510edf53/MANIFEST -> 0 artifacts
19/11/21 05:08:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 05:08:36 INFO sdk_worker_main.main: Logging handler created.
19/11/21 05:08:36 INFO sdk_worker_main.start: Status HTTP server running at localhost:42171
19/11/21 05:08:36 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 05:08:36 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 05:08:36 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574312911.26_f8e2e01e-2ba7-457f-ad64-83291aab3f64', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 05:08:36 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574312911.26', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56535'}
19/11/21 05:08:36 INFO statecache.__init__: Creating state cache with size 0
19/11/21 05:08:36 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43489.
19/11/21 05:08:36 INFO sdk_worker.__init__: Control channel established.
19/11/21 05:08:36 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 05:08:36 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/21 05:08:36 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43271.
19/11/21 05:08:36 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 05:08:36 INFO data_plane.create_data_channel: Creating client data channel for localhost:44019
19/11/21 05:08:36 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 05:08:36 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 05:08:36 INFO sdk_worker.run: No more requests from control plane
19/11/21 05:08:36 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 05:08:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 05:08:36 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 05:08:36 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 05:08:36 INFO sdk_worker.run: Done consuming work.
19/11/21 05:08:36 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 05:08:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 05:08:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 05:08:36 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574312911.26_f8e2e01e-2ba7-457f-ad64-83291aab3f64 finished.
19/11/21 05:08:36 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/21 05:08:36 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestfByBKc/job_69d59b5c-2b5c-4c51-a3ef-6c1c510edf53/MANIFEST has 0 artifact locations
19/11/21 05:08:36 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestfByBKc/job_69d59b5c-2b5c-4c51-a3ef-6c1c510edf53/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
==================== Timed out after 60 seconds. ====================
    _common.wait(self._state.condition.wait, _response_ready)

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
# Thread: <Thread(wait_until_finish_read, started daemon 140718936925952)>

# Thread: <Thread(Thread-119, started daemon 140718953711360)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
# Thread: <_MainThread(MainThread, started 140719732942592)>
==================== Timed out after 60 seconds. ====================

  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
# Thread: <Thread(wait_until_finish_read, started daemon 140718919091968)>

BaseException: Timed out after 60 seconds.

# Thread: <Thread(Thread-125, started daemon 140718927746816)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
# Thread: <Thread(Thread-119, started daemon 140718953711360)>

----------------------------------------------------------------------
Traceback (most recent call last):
# Thread: <Thread(wait_until_finish_read, started daemon 140718936925952)>

  File "apache_beam/runners/portability/fn_api_runner_test.py", line 32# Thread: <_MainThread(MainThread, started 140719732942592)>
8, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
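
The `handler` frame at portable_runner_test.py line 75 and the `# Thread:` dumps interleaved through this report come from a per-test watchdog: after 60 seconds it prints every live thread's stack and raises BaseException, which (unlike Exception) cannot be swallowed by broad except clauses in the code under test. A minimal sketch of that pattern; the function name, banner format, and alarm mechanism are illustrative assumptions, not Beam's exact code:

    import signal
    import sys
    import threading
    import traceback

    TIMEOUT_SECS = 60  # assumed, to match "Timed out after 60 seconds."

    def install_watchdog(timeout=TIMEOUT_SECS):
        def handler(signum, frame):
            msg = 'Timed out after %d seconds.' % timeout
            print('==================== %s ====================' % msg)
            # Dump every live thread so the hang is debuggable. Printing
            # from the handler is why the dumps interleave with the
            # unittest output in the log above.
            for thread_id, stack in sys._current_frames().items():
                for t in threading.enumerate():
                    if t.ident == thread_id:
                        print('# Thread: %s' % t)
                traceback.print_stack(stack)
            raise BaseException(msg)
        signal.signal(signal.SIGALRM, handler)
        signal.alarm(timeout)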

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574312901.45_e5e32260-4a96-4c97-8072-5151cf1d0777 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
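
The Java exception surfaced here suggests the portable Spark runner could not resume a checkpointed (deferred) restriction, so a splittable DoFn that defers work fails outright. For reference, the assertion that never runs flattens the input strings into single characters; below is a standalone sketch of those matcher semantics on the direct runner, with a plain FlatMap standing in for the real test's splittable DoFn and a made-up `data` value:

    import apache_beam as beam
    from apache_beam.testing.util import assert_that, equal_to

    data = ['abc', 'de']            # illustrative input
    expected = list(''.join(data))  # ['a', 'b', 'c', 'd', 'e']

    with beam.Pipeline() as p:
        actual = (p
                  | beam.Create(data)
                  | beam.FlatMap(list))  # emit each character separately
        assert_that(actual, equal_to(expected))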

----------------------------------------------------------------------
Ran 38 tests in 305.180s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
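
The failure itself is plain exit-status plumbing: the test run above reports three errors, so the Python test runner exits 1, the driving sh script propagates that, and Gradle reports the non-zero exit value. A sketch of the same propagation (the script name is a placeholder):

    import subprocess

    # FAILED (errors=3, skipped=9) makes the test runner exit with
    # status 1; Gradle surfaces that as a non-zero exit value.
    result = subprocess.run(['sh', 'run_validates_runner.sh'])  # placeholder
    if result.returncode != 0:
        raise RuntimeError('sh finished with non-zero exit value %d'
                           % result.returncode)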

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 31s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/nyxlne5ubh3vk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1589

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1589/display/redirect?page=changes>

Changes:

[M.Yzvenn] fix - 1MB is interpreted as 1000, not 1024

[valentyn] Guard pickling operations with a lock to prevent race condition in
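
As context for the two changes above, a hedged sketch of the pitfalls they describe (illustrative code, not the actual Beam patches):

    import pickle
    import threading

    # "1MB is interpreted as 1000, not 1024": decimal and binary
    # multipliers give different byte counts, so a size parser has to
    # commit to one explicitly.
    MB_DECIMAL = 1000 ** 2  # 1 MB  = 1,000,000 bytes
    MIB_BINARY = 1024 ** 2  # 1 MiB = 1,048,576 bytes

    # "Guard pickling operations with a lock": pickling machinery and
    # the objects being pickled may be shared across threads, so a
    # process-wide lock serializes the dumps.
    _pickle_lock = threading.Lock()

    def thread_safe_dumps(obj):
        with _pickle_lock:
            return pickle.dumps(obj)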


------------------------------------------
[...truncated 1.32 MB...]
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 01:16:27 INFO sdk_worker.run: No more requests from control plane
19/11/21 01:16:27 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 01:16:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 01:16:28 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 01:16:28 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 01:16:28 INFO sdk_worker.run: Done consuming work.
19/11/21 01:16:28 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 01:16:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 01:16:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 01:16:28 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest9IUc2c/job_3799dc55-c093-4396-8edf-f3d2c0ac616e/MANIFEST
19/11/21 01:16:28 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest9IUc2c/job_3799dc55-c093-4396-8edf-f3d2c0ac616e/MANIFEST -> 0 artifacts
19/11/21 01:16:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 01:16:28 INFO sdk_worker_main.main: Logging handler created.
19/11/21 01:16:28 INFO sdk_worker_main.start: Status HTTP server running at localhost:43499
19/11/21 01:16:28 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 01:16:28 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 01:16:28 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574298986.2_ccb5ce66-0d19-43e6-a90f-3992fc030ff2', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 01:16:28 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574298986.2', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44491'}
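
The pipeline_options dump above corresponds to submitting the portable pipeline to a local job server with a PROCESS environment, where each worker is launched via the sdk_worker.sh script. A rough sketch of equivalent options (endpoint and script path are placeholders echoing the log, not guaranteed to match the test harness exactly):

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:44491',  # placeholder endpoint
        '--environment_type=PROCESS',
        # The command must point at a worker launcher script:
        '--environment_config={"command": "/path/to/sdk_worker.sh"}',
        '--experiments=beam_fn_api',
    ])
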
19/11/21 01:16:28 INFO statecache.__init__: Creating state cache with size 0
19/11/21 01:16:28 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40747.
19/11/21 01:16:28 INFO sdk_worker.__init__: Control channel established.
19/11/21 01:16:28 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 01:16:28 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/21 01:16:28 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44029.
19/11/21 01:16:28 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 01:16:28 INFO data_plane.create_data_channel: Creating client data channel for localhost:32891
19/11/21 01:16:28 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 01:16:28 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 01:16:28 INFO sdk_worker.run: No more requests from control plane
19/11/21 01:16:28 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 01:16:28 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 01:16:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 01:16:28 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 01:16:28 INFO sdk_worker.run: Done consuming work.
19/11/21 01:16:28 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 01:16:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 01:16:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 01:16:29 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest9IUc2c/job_3799dc55-c093-4396-8edf-f3d2c0ac616e/MANIFEST
19/11/21 01:16:29 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest9IUc2c/job_3799dc55-c093-4396-8edf-f3d2c0ac616e/MANIFEST -> 0 artifacts
19/11/21 01:16:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 01:16:29 INFO sdk_worker_main.main: Logging handler created.
19/11/21 01:16:29 INFO sdk_worker_main.start: Status HTTP server running at localhost:46401
19/11/21 01:16:29 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 01:16:29 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 01:16:29 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574298986.2_ccb5ce66-0d19-43e6-a90f-3992fc030ff2', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 01:16:29 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574298986.2', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44491'}
19/11/21 01:16:29 INFO statecache.__init__: Creating state cache with size 0
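
The recurring "Creating state cache with size 0" means user-state caching is effectively disabled in these runs: every state request goes back over the gRPC state channel. A zero-capacity cache degenerates as in this sketch (illustrative, not Beam's statecache module):

    class IllustrativeStateCache(object):
        """A cache whose capacity of 0 means nothing is ever retained."""

        def __init__(self, max_entries):
            self._max_entries = max_entries
            self._entries = {}

        def put(self, key, value):
            if self._max_entries == 0:
                return  # capacity 0: always drop
            self._entries[key] = value  # (LRU eviction elided)

        def get(self, key):
            return self._entries.get(key)  # always a miss at capacity 0
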
19/11/21 01:16:29 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34067.
19/11/21 01:16:29 INFO sdk_worker.__init__: Control channel established.
19/11/21 01:16:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/21 01:16:29 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 01:16:29 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37959.
19/11/21 01:16:29 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 01:16:29 INFO data_plane.create_data_channel: Creating client data channel for localhost:34387
19/11/21 01:16:29 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 01:16:29 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 01:16:29 INFO sdk_worker.run: No more requests from control plane
19/11/21 01:16:29 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 01:16:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 01:16:29 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 01:16:29 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 01:16:29 INFO sdk_worker.run: Done consuming work.
19/11/21 01:16:29 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 01:16:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 01:16:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 01:16:29 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest9IUc2c/job_3799dc55-c093-4396-8edf-f3d2c0ac616e/MANIFEST
19/11/21 01:16:29 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest9IUc2c/job_3799dc55-c093-4396-8edf-f3d2c0ac616e/MANIFEST -> 0 artifacts
19/11/21 01:16:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 01:16:30 INFO sdk_worker_main.main: Logging handler created.
19/11/21 01:16:30 INFO sdk_worker_main.start: Status HTTP server running at localhost:35267
19/11/21 01:16:30 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 01:16:30 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 01:16:30 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574298986.2_ccb5ce66-0d19-43e6-a90f-3992fc030ff2', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 01:16:30 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574298986.2', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44491'}
19/11/21 01:16:30 INFO statecache.__init__: Creating state cache with size 0
19/11/21 01:16:30 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41677.
19/11/21 01:16:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/21 01:16:30 INFO sdk_worker.__init__: Control channel established.
19/11/21 01:16:30 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 01:16:30 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36341.
19/11/21 01:16:30 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 01:16:30 INFO data_plane.create_data_channel: Creating client data channel for localhost:45093
19/11/21 01:16:30 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 01:16:30 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 01:16:30 INFO sdk_worker.run: No more requests from control plane
19/11/21 01:16:30 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 01:16:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 01:16:30 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 01:16:30 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 01:16:30 INFO sdk_worker.run: Done consuming work.
19/11/21 01:16:30 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 01:16:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 01:16:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 01:16:30 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest9IUc2c/job_3799dc55-c093-4396-8edf-f3d2c0ac616e/MANIFEST
19/11/21 01:16:30 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest9IUc2c/job_3799dc55-c093-4396-8edf-f3d2c0ac616e/MANIFEST -> 0 artifacts
19/11/21 01:16:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 01:16:31 INFO sdk_worker_main.main: Logging handler created.
19/11/21 01:16:31 INFO sdk_worker_main.start: Status HTTP server running at localhost:35259
19/11/21 01:16:31 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 01:16:31 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 01:16:31 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574298986.2_ccb5ce66-0d19-43e6-a90f-3992fc030ff2', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 01:16:31 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574298986.2', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44491'}
19/11/21 01:16:31 INFO statecache.__init__: Creating state cache with size 0
19/11/21 01:16:31 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41279.
19/11/21 01:16:31 INFO sdk_worker.__init__: Control channel established.
19/11/21 01:16:31 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 01:16:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/21 01:16:31 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36935.
19/11/21 01:16:31 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 01:16:31 INFO data_plane.create_data_channel: Creating client data channel for localhost:40439
19/11/21 01:16:31 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 01:16:31 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 01:16:31 INFO sdk_worker.run: No more requests from control plane
19/11/21 01:16:31 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 01:16:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 01:16:31 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 01:16:31 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 01:16:31 INFO sdk_worker.run: Done consuming work.
19/11/21 01:16:31 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 01:16:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 01:16:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 01:16:31 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574298986.2_ccb5ce66-0d19-43e6-a90f-3992fc030ff2 finished.
19/11/21 01:16:31 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/21 01:16:31 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktest9IUc2c/job_3799dc55-c093-4396-8edf-f3d2c0ac616e/MANIFEST has 0 artifact locations
19/11/21 01:16:31 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktest9IUc2c/job_3799dc55-c093-4396-8edf-f3d2c0ac616e/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140449774970624)>

# Thread: <Thread(Thread-119, started daemon 140449758185216)>

# Thread: <_MainThread(MainThread, started 140450554201856)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140449739826944)>

# Thread: <Thread(Thread-125, started daemon 140449748481792)>

# Thread: <_MainThread(MainThread, started 140450554201856)>

# Thread: <Thread(Thread-119, started daemon 140449758185216)>

# Thread: <Thread(wait_until_finish_read, started daemon 140449774970624)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574298977.02_b4cdfa52-888f-45a7-a305-ae87594e6cbf failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 311.079s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 29s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/pp5senpner4mk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1588

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1588/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 00:20:59 INFO sdk_worker.run: No more requests from control plane
19/11/21 00:20:59 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 00:20:59 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 00:20:59 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 00:20:59 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 00:20:59 INFO sdk_worker.run: Done consuming work.
19/11/21 00:20:59 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 00:20:59 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 00:20:59 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 00:20:59 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestNWEkTf/job_fcaf75a8-3f51-4bb3-88bb-18828cb24b04/MANIFEST
19/11/21 00:20:59 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestNWEkTf/job_fcaf75a8-3f51-4bb3-88bb-18828cb24b04/MANIFEST -> 0 artifacts
19/11/21 00:21:00 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 00:21:00 INFO sdk_worker_main.main: Logging handler created.
19/11/21 00:21:00 INFO sdk_worker_main.start: Status HTTP server running at localhost:38931
19/11/21 00:21:00 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 00:21:00 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 00:21:00 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574295658.03_c20db409-cc81-4e4e-ad97-88b79958f8d3', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 00:21:00 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574295658.03', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46787'}
19/11/21 00:21:00 INFO statecache.__init__: Creating state cache with size 0
19/11/21 00:21:00 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45467.
19/11/21 00:21:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/21 00:21:00 INFO sdk_worker.__init__: Control channel established.
19/11/21 00:21:00 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 00:21:00 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:32857.
19/11/21 00:21:00 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 00:21:00 INFO data_plane.create_data_channel: Creating client data channel for localhost:40729
19/11/21 00:21:00 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 00:21:00 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 00:21:00 INFO sdk_worker.run: No more requests from control plane
19/11/21 00:21:00 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 00:21:00 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 00:21:00 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 00:21:00 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 00:21:00 INFO sdk_worker.run: Done consuming work.
19/11/21 00:21:00 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 00:21:00 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 00:21:00 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 00:21:00 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestNWEkTf/job_fcaf75a8-3f51-4bb3-88bb-18828cb24b04/MANIFEST
19/11/21 00:21:00 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestNWEkTf/job_fcaf75a8-3f51-4bb3-88bb-18828cb24b04/MANIFEST -> 0 artifacts
19/11/21 00:21:01 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 00:21:01 INFO sdk_worker_main.main: Logging handler created.
19/11/21 00:21:01 INFO sdk_worker_main.start: Status HTTP server running at localhost:41945
19/11/21 00:21:01 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 00:21:01 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 00:21:01 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574295658.03_c20db409-cc81-4e4e-ad97-88b79958f8d3', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 00:21:01 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574295658.03', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46787'}
19/11/21 00:21:01 INFO statecache.__init__: Creating state cache with size 0
19/11/21 00:21:01 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37575.
19/11/21 00:21:01 INFO sdk_worker.__init__: Control channel established.
19/11/21 00:21:01 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 00:21:01 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/21 00:21:01 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39567.
19/11/21 00:21:01 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 00:21:01 INFO data_plane.create_data_channel: Creating client data channel for localhost:39233
19/11/21 00:21:01 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 00:21:01 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 00:21:01 INFO sdk_worker.run: No more requests from control plane
19/11/21 00:21:01 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 00:21:01 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 00:21:01 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 00:21:01 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 00:21:01 INFO sdk_worker.run: Done consuming work.
19/11/21 00:21:01 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 00:21:01 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 00:21:01 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 00:21:01 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestNWEkTf/job_fcaf75a8-3f51-4bb3-88bb-18828cb24b04/MANIFEST
19/11/21 00:21:01 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestNWEkTf/job_fcaf75a8-3f51-4bb3-88bb-18828cb24b04/MANIFEST -> 0 artifacts
19/11/21 00:21:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 00:21:02 INFO sdk_worker_main.main: Logging handler created.
19/11/21 00:21:02 INFO sdk_worker_main.start: Status HTTP server running at localhost:45015
19/11/21 00:21:02 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 00:21:02 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 00:21:02 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574295658.03_c20db409-cc81-4e4e-ad97-88b79958f8d3', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 00:21:02 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574295658.03', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46787'}
19/11/21 00:21:02 INFO statecache.__init__: Creating state cache with size 0
19/11/21 00:21:02 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38797.
19/11/21 00:21:02 INFO sdk_worker.__init__: Control channel established.
19/11/21 00:21:02 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 00:21:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/21 00:21:02 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44377.
19/11/21 00:21:02 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 00:21:02 INFO data_plane.create_data_channel: Creating client data channel for localhost:45599
19/11/21 00:21:02 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 00:21:02 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 00:21:02 INFO sdk_worker.run: No more requests from control plane
19/11/21 00:21:02 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 00:21:02 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 00:21:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 00:21:02 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 00:21:02 INFO sdk_worker.run: Done consuming work.
19/11/21 00:21:02 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 00:21:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 00:21:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 00:21:02 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestNWEkTf/job_fcaf75a8-3f51-4bb3-88bb-18828cb24b04/MANIFEST
19/11/21 00:21:02 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestNWEkTf/job_fcaf75a8-3f51-4bb3-88bb-18828cb24b04/MANIFEST -> 0 artifacts
19/11/21 00:21:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/21 00:21:02 INFO sdk_worker_main.main: Logging handler created.
19/11/21 00:21:02 INFO sdk_worker_main.start: Status HTTP server running at localhost:37291
19/11/21 00:21:02 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/21 00:21:02 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/21 00:21:02 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574295658.03_c20db409-cc81-4e4e-ad97-88b79958f8d3', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/21 00:21:02 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574295658.03', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46787'}
19/11/21 00:21:02 INFO statecache.__init__: Creating state cache with size 0
19/11/21 00:21:02 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34539.
19/11/21 00:21:02 INFO sdk_worker.__init__: Control channel established.
19/11/21 00:21:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/21 00:21:02 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/21 00:21:02 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33379.
19/11/21 00:21:02 INFO sdk_worker.create_state_handler: State channel established.
19/11/21 00:21:03 INFO data_plane.create_data_channel: Creating client data channel for localhost:36781
19/11/21 00:21:03 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/21 00:21:03 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/21 00:21:03 INFO sdk_worker.run: No more requests from control plane
19/11/21 00:21:03 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/21 00:21:03 INFO data_plane.close: Closing all cached grpc data channels.
19/11/21 00:21:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 00:21:03 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/21 00:21:03 INFO sdk_worker.run: Done consuming work.
19/11/21 00:21:03 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/21 00:21:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/21 00:21:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/21 00:21:03 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574295658.03_c20db409-cc81-4e4e-ad97-88b79958f8d3 finished.
19/11/21 00:21:03 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/21 00:21:03 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestNWEkTf/job_fcaf75a8-3f51-4bb3-88bb-18828cb24b04/MANIFEST has 0 artifact locations
19/11/21 00:21:03 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestNWEkTf/job_fcaf75a8-3f51-4bb3-88bb-18828cb24b04/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
==================== Timed out after 60 seconds. ====================
    _common.wait(self._state.condition.wait, _response_ready)

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

# Thread: <Thread(wait_until_finish_read, started daemon 140138658379520)>

# Thread: <Thread(Thread-118, started daemon 140138649986816)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
# Thread: <_MainThread(MainThread, started 140139445872384)>
==================== Timed out after 60 seconds. ====================

  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(lis# Thread: <Thread(wait_until_finish_read, started daemon 140138640545536)>

t(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
# Thread: <Thread(Thread-124, started daemon 140138632152832)>

  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
# Thread: <Thread(Thread-118, started daemon 140138649986816)>

RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574295649.15_9b67a3d2-f490-43fd-98c3-8645d689a653 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

# Thread: <Thread(wait_until_finish_read, started daemon 140138658379520)>

----------------------------------------------------------------------
Ran 38 tests in 307.281s

# Thread: <_MainThread(MainThread, started 140139445872384)>
FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 49s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/w2eklvkpffu5a

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1587

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1587/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-8795] fix Spark runner build


------------------------------------------
[...truncated 1.32 MB...]
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/20 23:44:32 INFO sdk_worker.run: No more requests from control plane
19/11/20 23:44:32 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/20 23:44:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 23:44:32 INFO data_plane.close: Closing all cached grpc data channels.
19/11/20 23:44:32 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/20 23:44:32 INFO sdk_worker.run: Done consuming work.
19/11/20 23:44:32 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/20 23:44:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/20 23:44:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 23:44:32 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestKTiOUG/job_bb8ede11-a001-46b7-96d5-4312c142b4cf/MANIFEST
19/11/20 23:44:32 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestKTiOUG/job_bb8ede11-a001-46b7-96d5-4312c142b4cf/MANIFEST -> 0 artifacts
19/11/20 23:44:33 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/20 23:44:33 INFO sdk_worker_main.main: Logging handler created.
19/11/20 23:44:33 INFO sdk_worker_main.start: Status HTTP server running at localhost:36507
19/11/20 23:44:33 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/20 23:44:33 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/20 23:44:33 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574293471.06_4a0b6467-bd10-4d95-b515-ff8d1c520d57', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/20 23:44:33 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574293471.06', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58751'}
19/11/20 23:44:33 INFO statecache.__init__: Creating state cache with size 0
19/11/20 23:44:33 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36911.
19/11/20 23:44:33 INFO sdk_worker.__init__: Control channel established.
19/11/20 23:44:33 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/20 23:44:33 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/20 23:44:33 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42523.
19/11/20 23:44:33 INFO sdk_worker.create_state_handler: State channel established.
19/11/20 23:44:33 INFO data_plane.create_data_channel: Creating client data channel for localhost:43957
19/11/20 23:44:33 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/20 23:44:33 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/20 23:44:33 INFO sdk_worker.run: No more requests from control plane
19/11/20 23:44:33 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/20 23:44:33 INFO data_plane.close: Closing all cached grpc data channels.
19/11/20 23:44:33 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 23:44:33 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/20 23:44:33 INFO sdk_worker.run: Done consuming work.
19/11/20 23:44:33 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/20 23:44:33 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/20 23:44:33 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 23:44:33 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestKTiOUG/job_bb8ede11-a001-46b7-96d5-4312c142b4cf/MANIFEST
19/11/20 23:44:33 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestKTiOUG/job_bb8ede11-a001-46b7-96d5-4312c142b4cf/MANIFEST -> 0 artifacts
19/11/20 23:44:34 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/20 23:44:34 INFO sdk_worker_main.main: Logging handler created.
19/11/20 23:44:34 INFO sdk_worker_main.start: Status HTTP server running at localhost:36719
19/11/20 23:44:34 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/20 23:44:34 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/20 23:44:34 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574293471.06_4a0b6467-bd10-4d95-b515-ff8d1c520d57', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/20 23:44:34 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574293471.06', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58751'}
19/11/20 23:44:34 INFO statecache.__init__: Creating state cache with size 0
19/11/20 23:44:34 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42563.
19/11/20 23:44:34 INFO sdk_worker.__init__: Control channel established.
19/11/20 23:44:34 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/20 23:44:34 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/20 23:44:34 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33063.
19/11/20 23:44:34 INFO sdk_worker.create_state_handler: State channel established.
19/11/20 23:44:34 INFO data_plane.create_data_channel: Creating client data channel for localhost:45065
19/11/20 23:44:34 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/20 23:44:34 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/20 23:44:34 INFO sdk_worker.run: No more requests from control plane
19/11/20 23:44:34 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/20 23:44:34 INFO data_plane.close: Closing all cached grpc data channels.
19/11/20 23:44:34 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 23:44:34 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/20 23:44:34 INFO sdk_worker.run: Done consuming work.
19/11/20 23:44:34 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/20 23:44:34 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/20 23:44:34 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 23:44:34 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestKTiOUG/job_bb8ede11-a001-46b7-96d5-4312c142b4cf/MANIFEST
19/11/20 23:44:34 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestKTiOUG/job_bb8ede11-a001-46b7-96d5-4312c142b4cf/MANIFEST -> 0 artifacts
19/11/20 23:44:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/20 23:44:35 INFO sdk_worker_main.main: Logging handler created.
19/11/20 23:44:35 INFO sdk_worker_main.start: Status HTTP server running at localhost:39245
19/11/20 23:44:35 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/20 23:44:35 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/20 23:44:35 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574293471.06_4a0b6467-bd10-4d95-b515-ff8d1c520d57', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/20 23:44:35 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574293471.06', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58751'}
19/11/20 23:44:35 INFO statecache.__init__: Creating state cache with size 0
19/11/20 23:44:35 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40223.
19/11/20 23:44:35 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/20 23:44:35 INFO sdk_worker.__init__: Control channel established.
19/11/20 23:44:35 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/20 23:44:35 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44673.
19/11/20 23:44:35 INFO sdk_worker.create_state_handler: State channel established.
19/11/20 23:44:35 INFO data_plane.create_data_channel: Creating client data channel for localhost:44013
19/11/20 23:44:35 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/20 23:44:35 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/20 23:44:35 INFO sdk_worker.run: No more requests from control plane
19/11/20 23:44:35 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/20 23:44:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 23:44:35 INFO data_plane.close: Closing all cached grpc data channels.
19/11/20 23:44:35 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/20 23:44:35 INFO sdk_worker.run: Done consuming work.
19/11/20 23:44:35 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/20 23:44:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/20 23:44:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 23:44:35 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestKTiOUG/job_bb8ede11-a001-46b7-96d5-4312c142b4cf/MANIFEST
19/11/20 23:44:35 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestKTiOUG/job_bb8ede11-a001-46b7-96d5-4312c142b4cf/MANIFEST -> 0 artifacts
19/11/20 23:44:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/20 23:44:36 INFO sdk_worker_main.main: Logging handler created.
19/11/20 23:44:36 INFO sdk_worker_main.start: Status HTTP server running at localhost:37289
19/11/20 23:44:36 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/20 23:44:36 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/20 23:44:36 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574293471.06_4a0b6467-bd10-4d95-b515-ff8d1c520d57', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/20 23:44:36 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574293471.06', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58751'}
19/11/20 23:44:36 INFO statecache.__init__: Creating state cache with size 0
19/11/20 23:44:36 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37007.
19/11/20 23:44:36 INFO sdk_worker.__init__: Control channel established.
19/11/20 23:44:36 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/20 23:44:36 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/20 23:44:36 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36411.
19/11/20 23:44:36 INFO sdk_worker.create_state_handler: State channel established.
19/11/20 23:44:36 INFO data_plane.create_data_channel: Creating client data channel for localhost:41649
19/11/20 23:44:36 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/20 23:44:36 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/20 23:44:36 INFO sdk_worker.run: No more requests from control plane
19/11/20 23:44:36 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/20 23:44:36 INFO data_plane.close: Closing all cached grpc data channels.
19/11/20 23:44:36 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/20 23:44:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 23:44:36 INFO sdk_worker.run: Done consuming work.
19/11/20 23:44:36 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/20 23:44:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/20 23:44:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 23:44:36 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574293471.06_4a0b6467-bd10-4d95-b515-ff8d1c520d57 finished.
19/11/20 23:44:36 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/20 23:44:36 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestKTiOUG/job_bb8ede11-a001-46b7-96d5-4312c142b4cf/MANIFEST has 0 artifact locations
19/11/20 23:44:36 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestKTiOUG/job_bb8ede11-a001-46b7-96d5-4312c142b4cf/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
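The monitoring-infos warning above means a metrics query against this job's result would presumably come back empty. A hedged sketch, assuming Beam's Java metrics API, of the standard query that those monitoring infos would normally feed (the class and method names here are illustrative):

    // Hedged sketch, assuming Beam's Java metrics API: query every metric from
    // a finished PipelineResult. On this build's Spark portable runner the
    // results would presumably be empty, since monitoring infos are not
    // collected.
    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.metrics.MetricQueryResults;
    import org.apache.beam.sdk.metrics.MetricsFilter;

    public class MetricsQuerySketch {
      static void printCounters(PipelineResult result) {
        MetricQueryResults metrics =
            result.metrics().queryMetrics(MetricsFilter.builder().build());  // match all
        metrics.getCounters().forEach(
            counter -> System.out.println(counter.getName() + " = " + counter.getAttempted()));
      }
    }
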
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140288169330432)>

  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
# Thread: <Thread(Thread-119, started daemon 140288177723136)>

  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

# Thread: <_MainThread(MainThread, started 140289173010176)>
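Both this failure and test_pardo_timers below are harness-side timeouts, not runner errors: wait_until_finish blocks on the job-state gRPC stream until the test's alarm handler (portable_runner_test.py line 75) raises BaseException and dumps the live threads. For comparison, a hedged sketch of the bounded-wait idiom in Beam's Java SDK, where the deadline is passed to waitUntilFinish directly (pipeline construction elided; names illustrative):

    // Hedged sketch of the bounded-wait idiom in the Beam Java SDK; the
    // pipeline is assumed to be constructed elsewhere. waitUntilFinish(Duration)
    // returns the terminal state, or null if the deadline passes first.
    import java.io.IOException;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;
    import org.joda.time.Duration;

    class BoundedWait {
      static PipelineResult.State runWithDeadline(Pipeline pipeline) throws IOException {
        PipelineResult result = pipeline.run();
        PipelineResult.State state = result.waitUntilFinish(Duration.standardSeconds(60));
        if (state == null) {
          result.cancel();  // abandon the job instead of blocking forever
        }
        return state;
      }
    }
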
======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140288160937728)>
# Thread: <Thread(Thread-125, started daemon 140288152545024)>
# Thread: <_MainThread(MainThread, started 140289173010176)>
# Thread: <Thread(Thread-119, started daemon 140288177723136)>
# Thread: <Thread(wait_until_finish_read, started daemon 140288169330432)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574293461.74_12bb59fb-97e1-4a50-bb1b-3a4c89af4ad5 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
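Unlike the two timeouts above, this is a hard limitation: the splittable DoFn under test defers part of its restriction, and this build's Spark portable runner has no bundle checkpoint handler to resume the residual, so the job fails outright. A hedged sketch of the general shape of such a splittable DoFn in the Beam Java SDK of this era (annotation details shifted across 2.x releases; the class is illustrative, not the test's actual code):

    // A hedged sketch, not the test's actual code: the general shape of a
    // splittable DoFn in the Beam Java SDK of this era. When tryClaim() returns
    // false, the unclaimed remainder becomes a residual/checkpoint, which is
    // exactly what this runner could not resume. Watermark-estimator wiring is
    // omitted.
    import org.apache.beam.sdk.io.range.OffsetRange;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker;
    import org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker;

    class EmitChars extends DoFn<String, String> {
      @ProcessElement
      public void process(ProcessContext c, RestrictionTracker<OffsetRange, Long> tracker) {
        // Claim one offset at a time; stop as soon as a claim is refused.
        for (long i = tracker.currentRestriction().getFrom(); tracker.tryClaim(i); i++) {
          c.output(String.valueOf(c.element().charAt((int) i)));
        }
      }

      @GetInitialRestriction
      public OffsetRange initialRestriction(String element) {
        return new OffsetRange(0, element.length());  // one offset per character
      }

      @NewTracker
      public OffsetRangeTracker newTracker(OffsetRange range) {
        return new OffsetRangeTracker(range);
      }
    }
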

----------------------------------------------------------------------
Ran 38 tests in 320.175s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 29s
60 actionable tasks: 48 executed, 12 from cache

Publishing build scan...
https://gradle.com/s/ia4i63f4hrz6w

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1586

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1586/display/redirect?page=changes>

Changes:

[robertwb] [BEAM-8629] Don't return mutable class type hints.


------------------------------------------
[...truncated 28.35 KB...]
Resolving github.com/ianlancetaylor/demangle: commit='4883227f66371e02c4948937d3e2be1664d9be38', urls=[https://github.com/ianlancetaylor/demangle.git, git@github.com:ianlancetaylor/demangle.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving github.com/kr/fs: commit='2788f0dbd16903de03cb8186e5c7d97b69ad387b', urls=[https://github.com/kr/fs.git, git@github.com:kr/fs.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving github.com/magiconair/properties: commit='49d762b9817ba1c2e9d0c69183c2b4a8b8f1d934', urls=[https://github.com/magiconair/properties.git, git@github.com:magiconair/properties.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving github.com/mitchellh/go-homedir: commit='b8bc1bf767474819792c23f32d8286a45736f1c6', urls=[https://github.com/mitchellh/go-homedir.git, git@github.com:mitchellh/go-homedir.git]
Resolving github.com/mitchellh/mapstructure: commit='a4e142e9c047c904fa2f1e144d9a84e6133024bc', urls=[https://github.com/mitchellh/mapstructure.git, git@github.com:mitchellh/mapstructure.git]
Resolving github.com/nightlyone/lockfile: commit='0ad87eef1443f64d3d8c50da647e2b1552851124', urls=[https://github.com/nightlyone/lockfile, git@github.com:nightlyone/lockfile.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving github.com/openzipkin/zipkin-go: commit='3741243b287094fda649c7f0fa74bd51f37dc122', urls=[https://github.com/openzipkin/zipkin-go.git, git@github.com:openzipkin/zipkin-go.git]
Resolving github.com/pelletier/go-toml: commit='acdc4509485b587f5e675510c4f2c63e90ff68a8', urls=[https://github.com/pelletier/go-toml.git, git@github.com:pelletier/go-toml.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving github.com/pierrec/lz4: commit='ed8d4cc3b461464e69798080a0092bd028910298', urls=[https://github.com/pierrec/lz4.git, git@github.com:pierrec/lz4.git]
Resolving github.com/pierrec/xxHash: commit='a0006b13c722f7f12368c00a3d3c2ae8a999a0c6', urls=[https://github.com/pierrec/xxHash.git, git@github.com:pierrec/xxHash.git]
Resolving github.com/pkg/errors: commit='30136e27e2ac8d167177e8a583aa4c3fea5be833', urls=[https://github.com/pkg/errors.git, git@github.com:pkg/errors.git]
Resolving github.com/pkg/sftp: commit='22e9c1ccc02fc1b9fa3264572e49109b68a86947', urls=[https://github.com/pkg/sftp.git, git@github.com:pkg/sftp.git]
Resolving github.com/prometheus/client_golang: commit='9bb6ab929dcbe1c8393cd9ef70387cb69811bd1c', urls=[https://github.com/prometheus/client_golang.git, git@github.com:prometheus/client_golang.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving github.com/prometheus/procfs: commit='cb4147076ac75738c9a7d279075a253c0cc5acbd', urls=[https://github.com/prometheus/procfs.git, git@github.com:prometheus/procfs.git]
Resolving github.com/rcrowley/go-metrics: commit='8732c616f52954686704c8645fe1a9d59e9df7c1', urls=[https://github.com/rcrowley/go-metrics.git, git@github.com:rcrowley/go-metrics.git]
Resolving github.com/cpuguy83/go-md2man: commit='dc9f53734905c233adfc09fd4f063dce63ce3daf', urls=[https://github.com/cpuguy83/go-md2man.git, git@github.com:cpuguy83/go-md2man.git]
Resolving cached github.com/cpuguy83/go-md2man: commit='dc9f53734905c233adfc09fd4f063dce63ce3daf', urls=[https://github.com/cpuguy83/go-md2man.git, git@github.com:cpuguy83/go-md2man.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving github.com/spf13/afero: commit='bb8f1927f2a9d3ab41c9340aa034f6b803f4359c', urls=[https://github.com/spf13/afero.git, git@github.com:spf13/afero.git]

> Task :vendor:sdks-java-extensions-protobuf:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:shadowJar
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar
> Task :runners:core-construction-java:jar
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar

> Task :sdks:python:test-suites:portable:py2:setupVirtualenv
Successfully installed configparser-4.0.2 contextlib2-0.6.0.post1 enum34-1.1.6 filelock-3.0.12 futures-3.3.0 grpcio-1.25.0 grpcio-tools-1.3.5 importlib-metadata-0.23 more-itertools-5.0.0 pathlib2-2.3.5 pluggy-0.13.0 protobuf-3.10.0 py-1.8.0 scandir-1.10.0 six-1.13.0 toml-0.10.0 tox-3.11.1 virtualenv-16.7.7 zipp-0.6.0

> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar

> Task :sdks:go:resolveBuildDependencies
Resolving github.com/spf13/cast: commit='acbeb36b902d72a7a4c18e8f3241075e7ab763e4', urls=[https://github.com/spf13/cast.git, git@github.com:spf13/cast.git]
Resolving github.com/spf13/cobra: commit='93959269ad99e80983c9ba742a7e01203a4c0e4f', urls=[https://github.com/spf13/cobra.git, git@github.com:spf13/cobra.git]
Resolving github.com/spf13/jwalterweatherman: commit='7c0cea34c8ece3fbeb2b27ab9b59511d360fb394', urls=[https://github.com/spf13/jwalterweatherman.git, git@github.com:spf13/jwalterweatherman.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving github.com/spf13/viper: commit='aafc9e6bc7b7bb53ddaa75a5ef49a17d6e654be5', urls=[https://github.com/spf13/viper.git, git@github.com:spf13/viper.git]
Resolving github.com/stathat/go: commit='74669b9f388d9d788c97399a0824adbfee78400e', urls=[https://github.com/stathat/go.git, git@github.com:stathat/go.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving github.com/xordataexchange/crypt: commit='b2862e3d0a775f18c7cfe02273500ae307b61218', urls=[https://github.com/xordataexchange/crypt.git, git@github.com:xordataexchange/crypt.git]
Resolving go.opencensus.io: commit='aa2b39d1618ef56ba156f27cfcdae9042f68f0bc', urls=[https://github.com/census-instrumentation/opencensus-go]
Resolving golang.org/x/crypto: commit='d9133f5469342136e669e85192a26056b587f503', urls=[https://go.googlesource.com/crypto]
Resolving golang.org/x/debug: commit='95515998a8a4bd7448134b2cb5971dbeb12e0b77', urls=[https://go.googlesource.com/debug]
Resolving golang.org/x/net: commit='2fb46b16b8dda405028c50f7c7f0f9dd1fa6bfb1', urls=[https://go.googlesource.com/net]
Resolving golang.org/x/oauth2: commit='a032972e28060ca4f5644acffae3dfc268cc09db', urls=[https://go.googlesource.com/oauth2]
Resolving golang.org/x/sync: commit='fd80eb99c8f653c847d294a001bdf2a3a6f768f5', urls=[https://go.googlesource.com/sync]
Resolving golang.org/x/sys: commit='37707fdb30a5b38865cfb95e5aab41707daec7fd', urls=[https://go.googlesource.com/sys]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]

> Task :sdks:java:harness:shadowJar
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar

> Task :runners:spark:compileJava
<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/runners/spark/src/main/java/org/apache/beam/runners/spark/structuredstreaming/translation/batch/functions/SparkSideInputReader.java>:49: error: incompatible types: MultimapView is not a functional interface
      o -> Collections.EMPTY_LIST;
      ^
    multiple non-overriding abstract methods found in interface MultimapView
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
1 error

> Task :runners:spark:compileJava FAILED
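This one-line compiler error is the root cause of the build failure: in Java, a lambda can only implement a functional interface, one with exactly one abstract method, and the compiler reports that MultimapView now declares more than one (plausibly a result of the BEAM-3419 side-input change listed under build #1585 below). A self-contained demo of the compiler behavior, using hypothetical stand-in interfaces rather than Beam's real MultimapView:

    // Self-contained demo of the compiler behavior. A lambda can only target a
    // functional interface (exactly one abstract method). The two interfaces
    // below are hypothetical stand-ins, not Beam's real MultimapView.
    import java.util.Collections;

    public class LambdaTargetDemo {
      interface SingleMethodView<K, V> {
        Iterable<V> get(K key);            // one abstract method: a lambda works
      }

      interface MultiMethodView<K, V> {
        Iterable<K> get();                 // a second abstract method makes this
        Iterable<V> get(K key);            // "not a functional interface"
      }

      @SuppressWarnings("unchecked")
      public static void main(String[] args) {
        SingleMethodView<Object, Object> ok = o -> Collections.EMPTY_LIST;

        // The next line would reproduce the build error and does not compile:
        // MultiMethodView<Object, Object> broken = o -> Collections.EMPTY_LIST;

        // The fix: an explicit (here anonymous) class implementing every method.
        MultiMethodView<Object, Object> fixed = new MultiMethodView<Object, Object>() {
          @Override public Iterable<Object> get() { return Collections.emptyList(); }
          @Override public Iterable<Object> get(Object key) { return Collections.emptyList(); }
        };
        System.out.println(ok.get("k") + " / " + fixed.get("k"));
      }
    }

The fix on Beam's side follows the same idea: the lambda at SparkSideInputReader.java line 49 needs to become a full implementation of the enlarged interface.
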

> Task :sdks:go:resolveBuildDependencies
Resolving google.golang.org/api: commit='386d4e5f4f92f86e6aec85985761bba4b938a2d5', urls=[https://code.googlesource.com/google-api-go-client]
Resolving google.golang.org/genproto: commit='2b5a72b8730b0b16380010cfe5286c42108d88e7', urls=[https://github.com/google/go-genproto]
Resolving google.golang.org/grpc: commit='7646b5360d049a7ca31e9133315db43456f39e2e', urls=[https://github.com/grpc/grpc-go]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]

> Task :sdks:go:installDependencies
> Task :sdks:go:buildLinuxAmd64
> Task :sdks:go:goBuild

> Task :sdks:python:container:resolveBuildDependencies
Resolving ./github.com/apache/beam/sdks/go@<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/go>

> Task :sdks:python:container:resolveTestDependencies
> Task :sdks:python:container:installDependencies
> Task :sdks:python:container:buildDarwinAmd64
> Task :sdks:python:container:buildLinuxAmd64
> Task :sdks:python:container:goBuild
> Task :sdks:python:container:assemble
> Task :sdks:python:container:goVendor
> Task :sdks:python:container:goTest
> Task :sdks:python:container:goCover NO-SOURCE
> Task :sdks:python:container:goVet
> Task :sdks:python:container:goCheck
> Task :sdks:python:container:check
> Task :sdks:python:container:build

> Task :sdks:python:test-suites:portable:py2:createProcessWorker
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Obtaining file://<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python>
Processing /home/jenkins/.cache/pip/wheels/50/24/4d/4580ca4a299f1ad6fd63443e6e584cb21e9a07988e4aa8daac/crcmod-1.7-cp27-cp27mu-linux_x86_64.whl
Processing /home/jenkins/.cache/pip/wheels/59/b1/91/f02e76c732915c4015ab4010f3015469866c1eb9b14058d8e7/dill-0.3.1.1-cp27-none-any.whl
Collecting fastavro<0.22,>=0.21.4
  Using cached https://files.pythonhosted.org/packages/15/e3/5956c75f68906b119191ef30d9acff661b422cf918a29a03ee0c3ba774be/fastavro-0.21.24-cp27-cp27mu-manylinux1_x86_64.whl
Requirement already satisfied: future<1.0.0,>=0.16.0 in <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (0.16.0)
Requirement already satisfied: grpcio<2,>=1.12.1 in <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.25.0)
Processing /home/jenkins/.cache/pip/wheels/fe/a7/05/23e3699975fc20f8a30e00ac1e515ab8c61168e982abe4ce70/hdfs-2.5.8-cp27-none-any.whl
Processing /home/jenkins/.cache/pip/wheels/6d/41/4b/2b369d6e2b7eaebcdd423516d3fb659c7658c16a2be8fd04ec/httplib2-0.12.0-cp27-none-any.whl
Collecting mock<3.0.0,>=1.0.1
  Using cached https://files.pythonhosted.org/packages/e6/35/f187bdf23be87092bd0f1200d43d23076cee4d0dec109f195173fd3ebc79/mock-2.0.0-py2.py3-none-any.whl
Collecting numpy<2,>=1.14.3
  Using cached https://files.pythonhosted.org/packages/d7/b1/3367ea1f372957f97a6752ec725b87886e12af1415216feec9067e31df70/numpy-1.16.5-cp27-cp27mu-manylinux1_x86_64.whl
Collecting pymongo<4.0.0,>=3.8.0
  Using cached https://files.pythonhosted.org/packages/00/5c/5379d5b8167a5938918d9ee147f865f6f8a64b93947d402cfdca5c1416d2/pymongo-3.9.0-cp27-cp27mu-manylinux1_x86_64.whl
Processing /home/jenkins/.cache/pip/wheels/48/f7/87/b932f09c6335dbcf45d916937105a372ab14f353a9ca431d7d/oauth2client-3.0.0-cp27-none-any.whl
Requirement already satisfied: protobuf<4,>=3.5.0.post1 in <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (3.10.0)
Collecting pydot<2,>=1.2.0
  Using cached https://files.pythonhosted.org/packages/33/d1/b1479a770f66d962f545c2101630ce1d5592d90cb4f083d38862e93d16d2/pydot-1.4.1-py2.py3-none-any.whl
Collecting python-dateutil<3,>=2.8.0
  Using cached https://files.pythonhosted.org/packages/d4/70/d60450c3dd48ef87586924207ae8907090de0b306af2bce5d134d78615cb/python_dateutil-2.8.1-py2.py3-none-any.whl
Collecting pytz>=2018.3
  Using cached https://files.pythonhosted.org/packages/e7/f9/f0b53f88060247251bf481fa6ea62cd0d25bf1b11a87888e53ce5b7c8ad2/pytz-2019.3-py2.py3-none-any.whl
Processing /home/jenkins/.cache/pip/wheels/fe/03/0c/354f2a3b1e10ab337b45728410509de62fbc0f7fe09ad196a2/avro-1.9.1-cp27-none-any.whl
Collecting funcsigs<2,>=1.0.2
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Requirement already satisfied: futures<4.0.0,>=3.2.0 in <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (3.3.0)
Processing /home/jenkins/.cache/pip/wheels/81/91/41/3272543c0b9c61da9c525f24ee35bae6fe8f60d4858c66805d/PyVCF-0.6.8-cp27-none-any.whl
Collecting typing<3.7.0,>=3.6.0
  Using cached https://files.pythonhosted.org/packages/cc/3e/29f92b7aeda5b078c86d14f550bf85cff809042e3429ace7af6193c3bc9f/typing-3.6.6-py2-none-any.whl
Collecting pyarrow<0.16.0,>=0.15.1
  Using cached https://files.pythonhosted.org/packages/f0/c8/f2987c293fea8897406bf93bca84ae67522c3f5d45f36751fe348310d538/pyarrow-0.15.1-cp27-cp27mu-manylinux2010_x86_64.whl
Collecting nose>=1.3.7
  Using cached https://files.pythonhosted.org/packages/99/4f/13fb671119e65c4dce97c60e67d3fd9e6f7f809f2b307e2611f4701205cb/nose-1.3.7-py2-none-any.whl
Processing /home/jenkins/.cache/pip/wheels/c4/1f/cd/9250fbf2fcc179e28bb4f7ee26a4fc7525914469d83a4f0c09/nose_xunitmp-0.4.1-cp27-none-any.whl
Collecting pandas<0.25,>=0.23.4
  Using cached https://files.pythonhosted.org/packages/db/83/7d4008ffc2988066ff37f6a0bb6d7b60822367dcb36ba5e39aa7801fda54/pandas-0.24.2-cp27-cp27mu-manylinux1_x86_64.whl
Collecting parameterized<0.7.0,>=0.6.0
  Using cached https://files.pythonhosted.org/packages/3a/49/75f6dadb09e2f8ace3cdffe0c99a04f1b98dff41fbf9e768665d8b469e29/parameterized-0.6.3-py2.py3-none-any.whl
Collecting pyhamcrest<2.0,>=1.9
  Using cached https://files.pythonhosted.org/packages/9a/d5/d37fd731b7d0e91afcc84577edeccf4638b4f9b82f5ffe2f8b62e2ddc609/PyHamcrest-1.9.0-py2.py3-none-any.whl
Processing /home/jenkins/.cache/pip/wheels/d9/45/dd/65f0b38450c47cf7e5312883deb97d065e030c5cca0a365030/PyYAML-5.1.2-cp27-cp27mu-linux_x86_64.whl
Collecting requests_mock<2.0,>=1.7
  Using cached https://files.pythonhosted.org/packages/8c/f1/66c54a412543b29454102ae74b1454fce2d307b1c36e6bd2e9818394df88/requests_mock-1.7.0-py2.py3-none-any.whl
Collecting tenacity<6.0,>=5.0.2
  Using cached https://files.pythonhosted.org/packages/45/67/67bb1db087678bc5c6f20766cf18914dfe37b0b9d4e4c5bb87408460b75f/tenacity-5.1.5-py2.py3-none-any.whl
Collecting pytest<5.0,>=4.4.0
  Using cached https://files.pythonhosted.org/packages/64/f1/187a98b8f913a8f3a53d213cca2fae19718565f36165804d7f4f91fe5b76/pytest-4.6.6-py2.py3-none-any.whl
Collecting pytest-xdist<2,>=1.29.0
  Using cached https://files.pythonhosted.org/packages/f7/80/2af1fc039f779f61c7207dc9f79a1479874e7795f869fddaf135efde1cd4/pytest_xdist-1.30.0-py2.py3-none-any.whl
Requirement already satisfied: six>=1.5.2 in <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/lib/python2.7/site-packages> (from grpcio<2,>=1.12.1->apache-beam==2.18.0.dev0) (1.13.0)
Requirement already satisfied: enum34>=1.0.4; python_version < "3.4" in <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/lib/python2.7/site-packages> (from grpcio<2,>=1.12.1->apache-beam==2.18.0.dev0) (1.1.6)
Processing /home/jenkins/.cache/pip/wheels/9b/04/dd/7daf4150b6d9b12949298737de9431a324d4b797ffd63f526e/docopt-0.6.2-py2.py3-none-any.whl
Collecting requests>=2.7.0
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting pbr>=0.11
  Using cached https://files.pythonhosted.org/packages/46/a4/d5c83831a3452713e4b4f126149bc4fbda170f7cb16a86a00ce57ce0e9ad/pbr-5.4.3-py2.py3-none-any.whl
Collecting pyasn1>=0.1.7
  Using cached https://files.pythonhosted.org/packages/62/1e/a94a8d635fa3ce4cfc7f506003548d0a2447ae76fd5ca53932970fe3053f/pyasn1-0.4.8-py2.py3-none-any.whl
Collecting pyasn1-modules>=0.0.5
  Using cached https://files.pythonhosted.org/packages/52/50/bb4cefca37da63a0c52218ba2cb1b1c36110d84dcbae8aa48cd67c5e95c2/pyasn1_modules-0.2.7-py2.py3-none-any.whl
Collecting rsa>=3.1.4
  Using cached https://files.pythonhosted.org/packages/02/e5/38518af393f7c214357079ce67a317307936896e961e35450b70fad2a9cf/rsa-4.0-py2.py3-none-any.whl
Requirement already satisfied: setuptools in <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/lib/python2.7/site-packages> (from protobuf<4,>=3.5.0.post1->apache-beam==2.18.0.dev0) (41.6.0)
Collecting pyparsing>=2.1.4
  Using cached https://files.pythonhosted.org/packages/c0/0c/fc2e007d9a992d997f04a80125b0f183da7fb554f1de701bbb70a8e7d479/pyparsing-2.4.5-py2.py3-none-any.whl
Collecting monotonic>=0.6; python_version == "2.7"
  Using cached https://files.pythonhosted.org/packages/ac/aa/063eca6a416f397bd99552c534c6d11d57f58f2e94c14780f3bbf818c4cf/monotonic-1.5-py2.py3-none-any.whl
Collecting atomicwrites>=1.0
  Using cached https://files.pythonhosted.org/packages/52/90/6155aa926f43f2b2a22b01be7241be3bfd1ceaf7d0b3267213e8127d41f4/atomicwrites-1.3.0-py2.py3-none-any.whl
Collecting packaging
  Using cached https://files.pythonhosted.org/packages/cf/94/9672c2d4b126e74c4496c6b3c58a8b51d6419267be9e70660ba23374c875/packaging-19.2-py2.py3-none-any.whl
Collecting wcwidth
  Using cached https://files.pythonhosted.org/packages/7e/9f/526a6947247599b084ee5232e4f9190a38f398d7300d866af3ab571a5bfe/wcwidth-0.1.7-py2.py3-none-any.whl
Requirement already satisfied: importlib-metadata>=0.12; python_version < "3.8" in <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (0.23)
Requirement already satisfied: py>=1.5.0 in <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (1.8.0)
Requirement already satisfied: pathlib2>=2.2.0; python_version < "3.6" in <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (2.3.5)
Requirement already satisfied: pluggy<1.0,>=0.12 in <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (0.13.0)
Collecting attrs>=17.4.0
  Using cached https://files.pythonhosted.org/packages/a2/db/4313ab3be961f7a763066401fb77f7748373b6094076ae2bda2806988af6/attrs-19.3.0-py2.py3-none-any.whl
Requirement already satisfied: more-itertools<6.0.0,>=4.0.0; python_version <= "2.7" in <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (5.0.0)
Collecting pytest-forked
  Using cached https://files.pythonhosted.org/packages/03/1e/81235e1fcfed57a4e679d34794d60c01a1e9a29ef5b9844d797716111d80/pytest_forked-1.1.3-py2.py3-none-any.whl
Collecting execnet>=1.1
  Using cached https://files.pythonhosted.org/packages/d3/2e/c63af07fa471e0a02d05793c7a56a9f7d274a8489442a5dc4fb3b2b3c705/execnet-1.7.1-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1
  Using cached https://files.pythonhosted.org/packages/b4/40/a9837291310ee1ccc242ceb6ebfd9eb21539649f193a7c8c86ba15b98539/urllib3-1.25.7-py2.py3-none-any.whl
Collecting certifi>=2017.4.17
  Using cached https://files.pythonhosted.org/packages/18/b0/8146a4f8dd402f60744fa380bc73ca47303cccf8b9190fd16a827281eac2/certifi-2019.9.11-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Requirement already satisfied: contextlib2; python_version < "3" in <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (0.6.0.post1)
Requirement already satisfied: zipp>=0.5 in <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (0.6.0)
Requirement already satisfied: configparser>=3.5; python_version < "3" in <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (4.0.2)
Requirement already satisfied: scandir; python_version < "3.5" in <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/lib/python2.7/site-packages> (from pathlib2>=2.2.0; python_version < "3.6"->pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (1.10.0)
Collecting apipkg>=1.4
  Using cached https://files.pythonhosted.org/packages/67/08/4815a09603fc800209431bec5b8bd2acf2f95abdfb558a44a42507fb94da/apipkg-1.5-py2.py3-none-any.whl
Installing collected packages: crcmod, dill, fastavro, docopt, urllib3, certifi, chardet, idna, requests, hdfs, httplib2, pbr, funcsigs, mock, numpy, pymongo, pyasn1, pyasn1-modules, rsa, oauth2client, pyparsing, pydot, python-dateutil, pytz, avro, pyvcf, typing, pyarrow, nose, nose-xunitmp, pandas, parameterized, pyhamcrest, pyyaml, requests-mock, monotonic, tenacity, atomicwrites, packaging, wcwidth, attrs, pytest, pytest-forked, apipkg, execnet, pytest-xdist, apache-beam
  Running setup.py develop for apache-beam
Successfully installed apache-beam apipkg-1.5 atomicwrites-1.3.0 attrs-19.3.0 avro-1.9.1 certifi-2019.9.11 chardet-3.0.4 crcmod-1.7 dill-0.3.1.1 docopt-0.6.2 execnet-1.7.1 fastavro-0.21.24 funcsigs-1.0.2 hdfs-2.5.8 httplib2-0.12.0 idna-2.8 mock-2.0.0 monotonic-1.5 nose-1.3.7 nose-xunitmp-0.4.1 numpy-1.16.5 oauth2client-3.0.0 packaging-19.2 pandas-0.24.2 parameterized-0.6.3 pbr-5.4.3 pyarrow-0.15.1 pyasn1-0.4.8 pyasn1-modules-0.2.7 pydot-1.4.1 pyhamcrest-1.9.0 pymongo-3.9.0 pyparsing-2.4.5 pytest-4.6.6 pytest-forked-1.1.3 pytest-xdist-1.30.0 python-dateutil-2.8.1 pytz-2019.3 pyvcf-0.6.8 pyyaml-5.1.2 requests-2.22.0 requests-mock-1.7.0 rsa-4.0 tenacity-5.1.5 typing-3.6.6 urllib3-1.25.7 wcwidth-0.1.7

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':runners:spark:compileJava'.
> Compilation failed with exit code 1; see the compiler error output for details.

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2m 44s
57 actionable tasks: 45 executed, 12 from cache

Publishing build scan...
https://gradle.com/s/l33oly5s7uctc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1585

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1585/display/redirect?page=changes>

Changes:

[github] [BEAM-3419] Flesh out iterable side inputs and key enumeration for


------------------------------------------
[...truncated 28.03 KB...]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving github.com/hashicorp/hcl: commit='23c074d0eceb2b8a5bfdbb271ab780cde70f05a8', urls=[https://github.com/hashicorp/hcl.git, git@github.com:hashicorp/hcl.git]
Resolving github.com/ianlancetaylor/demangle: commit='4883227f66371e02c4948937d3e2be1664d9be38', urls=[https://github.com/ianlancetaylor/demangle.git, git@github.com:ianlancetaylor/demangle.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving github.com/kr/fs: commit='2788f0dbd16903de03cb8186e5c7d97b69ad387b', urls=[https://github.com/kr/fs.git, git@github.com:kr/fs.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving github.com/magiconair/properties: commit='49d762b9817ba1c2e9d0c69183c2b4a8b8f1d934', urls=[https://github.com/magiconair/properties.git, git@github.com:magiconair/properties.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving github.com/mitchellh/go-homedir: commit='b8bc1bf767474819792c23f32d8286a45736f1c6', urls=[https://github.com/mitchellh/go-homedir.git, git@github.com:mitchellh/go-homedir.git]
Resolving github.com/mitchellh/mapstructure: commit='a4e142e9c047c904fa2f1e144d9a84e6133024bc', urls=[https://github.com/mitchellh/mapstructure.git, git@github.com:mitchellh/mapstructure.git]
Resolving github.com/nightlyone/lockfile: commit='0ad87eef1443f64d3d8c50da647e2b1552851124', urls=[https://github.com/nightlyone/lockfile, git@github.com:nightlyone/lockfile.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving github.com/openzipkin/zipkin-go: commit='3741243b287094fda649c7f0fa74bd51f37dc122', urls=[https://github.com/openzipkin/zipkin-go.git, git@github.com:openzipkin/zipkin-go.git]
Resolving github.com/pelletier/go-toml: commit='acdc4509485b587f5e675510c4f2c63e90ff68a8', urls=[https://github.com/pelletier/go-toml.git, git@github.com:pelletier/go-toml.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving github.com/pierrec/lz4: commit='ed8d4cc3b461464e69798080a0092bd028910298', urls=[https://github.com/pierrec/lz4.git, git@github.com:pierrec/lz4.git]
Resolving github.com/pierrec/xxHash: commit='a0006b13c722f7f12368c00a3d3c2ae8a999a0c6', urls=[https://github.com/pierrec/xxHash.git, git@github.com:pierrec/xxHash.git]
Resolving github.com/pkg/errors: commit='30136e27e2ac8d167177e8a583aa4c3fea5be833', urls=[https://github.com/pkg/errors.git, git@github.com:pkg/errors.git]
Resolving github.com/pkg/sftp: commit='22e9c1ccc02fc1b9fa3264572e49109b68a86947', urls=[https://github.com/pkg/sftp.git, git@github.com:pkg/sftp.git]
Resolving github.com/prometheus/client_golang: commit='9bb6ab929dcbe1c8393cd9ef70387cb69811bd1c', urls=[https://github.com/prometheus/client_golang.git, git@github.com:prometheus/client_golang.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving github.com/prometheus/procfs: commit='cb4147076ac75738c9a7d279075a253c0cc5acbd', urls=[https://github.com/prometheus/procfs.git, git@github.com:prometheus/procfs.git]
Resolving github.com/rcrowley/go-metrics: commit='8732c616f52954686704c8645fe1a9d59e9df7c1', urls=[https://github.com/rcrowley/go-metrics.git, git@github.com:rcrowley/go-metrics.git]
Resolving github.com/cpuguy83/go-md2man: commit='dc9f53734905c233adfc09fd4f063dce63ce3daf', urls=[https://github.com/cpuguy83/go-md2man.git, git@github.com:cpuguy83/go-md2man.git]
Resolving cached github.com/cpuguy83/go-md2man: commit='dc9f53734905c233adfc09fd4f063dce63ce3daf', urls=[https://github.com/cpuguy83/go-md2man.git, git@github.com:cpuguy83/go-md2man.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving github.com/spf13/afero: commit='bb8f1927f2a9d3ab41c9340aa034f6b803f4359c', urls=[https://github.com/spf13/afero.git, git@github.com:spf13/afero.git]
Resolving github.com/spf13/cast: commit='acbeb36b902d72a7a4c18e8f3241075e7ab763e4', urls=[https://github.com/spf13/cast.git, git@github.com:spf13/cast.git]
Resolving github.com/spf13/cobra: commit='93959269ad99e80983c9ba742a7e01203a4c0e4f', urls=[https://github.com/spf13/cobra.git, git@github.com:spf13/cobra.git]
Resolving github.com/spf13/jwalterweatherman: commit='7c0cea34c8ece3fbeb2b27ab9b59511d360fb394', urls=[https://github.com/spf13/jwalterweatherman.git, git@github.com:spf13/jwalterweatherman.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving github.com/spf13/viper: commit='aafc9e6bc7b7bb53ddaa75a5ef49a17d6e654be5', urls=[https://github.com/spf13/viper.git, git@github.com:spf13/viper.git]

> Task :vendor:sdks-java-extensions-protobuf:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:shadowJar
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar
> Task :runners:core-construction-java:jar
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar

> Task :sdks:go:resolveBuildDependencies
Resolving github.com/stathat/go: commit='74669b9f388d9d788c97399a0824adbfee78400e', urls=[https://github.com/stathat/go.git, git@github.com:stathat/go.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving github.com/xordataexchange/crypt: commit='b2862e3d0a775f18c7cfe02273500ae307b61218', urls=[https://github.com/xordataexchange/crypt.git, git@github.com:xordataexchange/crypt.git]
Resolving go.opencensus.io: commit='aa2b39d1618ef56ba156f27cfcdae9042f68f0bc', urls=[https://github.com/census-instrumentation/opencensus-go]
Resolving golang.org/x/crypto: commit='d9133f5469342136e669e85192a26056b587f503', urls=[https://go.googlesource.com/crypto]
Resolving golang.org/x/debug: commit='95515998a8a4bd7448134b2cb5971dbeb12e0b77', urls=[https://go.googlesource.com/debug]
Resolving golang.org/x/net: commit='2fb46b16b8dda405028c50f7c7f0f9dd1fa6bfb1', urls=[https://go.googlesource.com/net]
Resolving golang.org/x/oauth2: commit='a032972e28060ca4f5644acffae3dfc268cc09db', urls=[https://go.googlesource.com/oauth2]
Resolving golang.org/x/sync: commit='fd80eb99c8f653c847d294a001bdf2a3a6f768f5', urls=[https://go.googlesource.com/sync]
Resolving golang.org/x/sys: commit='37707fdb30a5b38865cfb95e5aab41707daec7fd', urls=[https://go.googlesource.com/sys]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]

> Task :sdks:java:harness:shadowJar
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar

> Task :runners:spark:compileJava
<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/runners/spark/src/main/java/org/apache/beam/runners/spark/structuredstreaming/translation/batch/functions/SparkSideInputReader.java>:49: error: incompatible types: MultimapView is not a functional interface
      o -> Collections.EMPTY_LIST;
      ^
    multiple non-overriding abstract methods found in interface MultimapView
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
1 error

> Task :runners:spark:compileJava FAILED

> Task :sdks:go:resolveBuildDependencies
Resolving google.golang.org/api: commit='386d4e5f4f92f86e6aec85985761bba4b938a2d5', urls=[https://code.googlesource.com/google-api-go-client]
Resolving google.golang.org/genproto: commit='2b5a72b8730b0b16380010cfe5286c42108d88e7', urls=[https://github.com/google/go-genproto]
Resolving google.golang.org/grpc: commit='7646b5360d049a7ca31e9133315db43456f39e2e', urls=[https://github.com/grpc/grpc-go]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]

> Task :sdks:go:installDependencies
> Task :sdks:go:buildLinuxAmd64
> Task :sdks:go:goBuild

> Task :sdks:python:container:resolveBuildDependencies
Resolving ./github.com/apache/beam/sdks/go@<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/go>

> Task :sdks:python:container:resolveTestDependencies
> Task :sdks:python:container:installDependencies
> Task :sdks:python:container:buildDarwinAmd64
> Task :sdks:python:container:buildLinuxAmd64
> Task :sdks:python:container:goBuild
> Task :sdks:python:container:assemble
> Task :sdks:python:container:goVendor
> Task :sdks:python:container:goTest
> Task :sdks:python:container:goCover NO-SOURCE
> Task :sdks:python:container:goVet
> Task :sdks:python:container:goCheck
> Task :sdks:python:container:check
> Task :sdks:python:container:build

> Task :sdks:python:test-suites:portable:py2:createProcessWorker
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Obtaining file://<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python>
Processing /home/jenkins/.cache/pip/wheels/50/24/4d/4580ca4a299f1ad6fd63443e6e584cb21e9a07988e4aa8daac/crcmod-1.7-cp27-cp27mu-linux_x86_64.whl
Processing /home/jenkins/.cache/pip/wheels/59/b1/91/f02e76c732915c4015ab4010f3015469866c1eb9b14058d8e7/dill-0.3.1.1-cp27-none-any.whl
Collecting fastavro<0.22,>=0.21.4
  Using cached https://files.pythonhosted.org/packages/15/e3/5956c75f68906b119191ef30d9acff661b422cf918a29a03ee0c3ba774be/fastavro-0.21.24-cp27-cp27mu-manylinux1_x86_64.whl
Requirement already satisfied: future<1.0.0,>=0.16.0 in <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (0.16.0)
Requirement already satisfied: grpcio<2,>=1.12.1 in <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (1.25.0)
Processing /home/jenkins/.cache/pip/wheels/fe/a7/05/23e3699975fc20f8a30e00ac1e515ab8c61168e982abe4ce70/hdfs-2.5.8-cp27-none-any.whl
Processing /home/jenkins/.cache/pip/wheels/6d/41/4b/2b369d6e2b7eaebcdd423516d3fb659c7658c16a2be8fd04ec/httplib2-0.12.0-cp27-none-any.whl
Collecting mock<3.0.0,>=1.0.1
  Using cached https://files.pythonhosted.org/packages/e6/35/f187bdf23be87092bd0f1200d43d23076cee4d0dec109f195173fd3ebc79/mock-2.0.0-py2.py3-none-any.whl
Collecting numpy<2,>=1.14.3
  Using cached https://files.pythonhosted.org/packages/d7/b1/3367ea1f372957f97a6752ec725b87886e12af1415216feec9067e31df70/numpy-1.16.5-cp27-cp27mu-manylinux1_x86_64.whl
Collecting pymongo<4.0.0,>=3.8.0
  Using cached https://files.pythonhosted.org/packages/00/5c/5379d5b8167a5938918d9ee147f865f6f8a64b93947d402cfdca5c1416d2/pymongo-3.9.0-cp27-cp27mu-manylinux1_x86_64.whl
Processing /home/jenkins/.cache/pip/wheels/48/f7/87/b932f09c6335dbcf45d916937105a372ab14f353a9ca431d7d/oauth2client-3.0.0-cp27-none-any.whl
Requirement already satisfied: protobuf<4,>=3.5.0.post1 in <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (3.10.0)
Collecting pydot<2,>=1.2.0
  Using cached https://files.pythonhosted.org/packages/33/d1/b1479a770f66d962f545c2101630ce1d5592d90cb4f083d38862e93d16d2/pydot-1.4.1-py2.py3-none-any.whl
Collecting python-dateutil<3,>=2.8.0
  Using cached https://files.pythonhosted.org/packages/d4/70/d60450c3dd48ef87586924207ae8907090de0b306af2bce5d134d78615cb/python_dateutil-2.8.1-py2.py3-none-any.whl
Collecting pytz>=2018.3
  Using cached https://files.pythonhosted.org/packages/e7/f9/f0b53f88060247251bf481fa6ea62cd0d25bf1b11a87888e53ce5b7c8ad2/pytz-2019.3-py2.py3-none-any.whl
Processing /home/jenkins/.cache/pip/wheels/fe/03/0c/354f2a3b1e10ab337b45728410509de62fbc0f7fe09ad196a2/avro-1.9.1-cp27-none-any.whl
Collecting funcsigs<2,>=1.0.2
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Requirement already satisfied: futures<4.0.0,>=3.2.0 in <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/lib/python2.7/site-packages> (from apache-beam==2.18.0.dev0) (3.3.0)
Processing /home/jenkins/.cache/pip/wheels/81/91/41/3272543c0b9c61da9c525f24ee35bae6fe8f60d4858c66805d/PyVCF-0.6.8-cp27-none-any.whl
Collecting typing<3.7.0,>=3.6.0
  Using cached https://files.pythonhosted.org/packages/cc/3e/29f92b7aeda5b078c86d14f550bf85cff809042e3429ace7af6193c3bc9f/typing-3.6.6-py2-none-any.whl
Collecting pyarrow<0.16.0,>=0.15.1
  Using cached https://files.pythonhosted.org/packages/f0/c8/f2987c293fea8897406bf93bca84ae67522c3f5d45f36751fe348310d538/pyarrow-0.15.1-cp27-cp27mu-manylinux2010_x86_64.whl
Collecting nose>=1.3.7
  Using cached https://files.pythonhosted.org/packages/99/4f/13fb671119e65c4dce97c60e67d3fd9e6f7f809f2b307e2611f4701205cb/nose-1.3.7-py2-none-any.whl
Processing /home/jenkins/.cache/pip/wheels/c4/1f/cd/9250fbf2fcc179e28bb4f7ee26a4fc7525914469d83a4f0c09/nose_xunitmp-0.4.1-cp27-none-any.whl
Collecting pandas<0.25,>=0.23.4
  Using cached https://files.pythonhosted.org/packages/db/83/7d4008ffc2988066ff37f6a0bb6d7b60822367dcb36ba5e39aa7801fda54/pandas-0.24.2-cp27-cp27mu-manylinux1_x86_64.whl
Collecting parameterized<0.7.0,>=0.6.0
  Using cached https://files.pythonhosted.org/packages/3a/49/75f6dadb09e2f8ace3cdffe0c99a04f1b98dff41fbf9e768665d8b469e29/parameterized-0.6.3-py2.py3-none-any.whl
Collecting pyhamcrest<2.0,>=1.9
  Using cached https://files.pythonhosted.org/packages/9a/d5/d37fd731b7d0e91afcc84577edeccf4638b4f9b82f5ffe2f8b62e2ddc609/PyHamcrest-1.9.0-py2.py3-none-any.whl
Processing /home/jenkins/.cache/pip/wheels/d9/45/dd/65f0b38450c47cf7e5312883deb97d065e030c5cca0a365030/PyYAML-5.1.2-cp27-cp27mu-linux_x86_64.whl
Collecting requests_mock<2.0,>=1.7
  Using cached https://files.pythonhosted.org/packages/8c/f1/66c54a412543b29454102ae74b1454fce2d307b1c36e6bd2e9818394df88/requests_mock-1.7.0-py2.py3-none-any.whl
Collecting tenacity<6.0,>=5.0.2
  Using cached https://files.pythonhosted.org/packages/45/67/67bb1db087678bc5c6f20766cf18914dfe37b0b9d4e4c5bb87408460b75f/tenacity-5.1.5-py2.py3-none-any.whl
Collecting pytest<5.0,>=4.4.0
  Using cached https://files.pythonhosted.org/packages/64/f1/187a98b8f913a8f3a53d213cca2fae19718565f36165804d7f4f91fe5b76/pytest-4.6.6-py2.py3-none-any.whl
Collecting pytest-xdist<2,>=1.29.0
  Using cached https://files.pythonhosted.org/packages/f7/80/2af1fc039f779f61c7207dc9f79a1479874e7795f869fddaf135efde1cd4/pytest_xdist-1.30.0-py2.py3-none-any.whl
Requirement already satisfied: six>=1.5.2 in <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/lib/python2.7/site-packages> (from grpcio<2,>=1.12.1->apache-beam==2.18.0.dev0) (1.13.0)
Requirement already satisfied: enum34>=1.0.4; python_version < "3.4" in <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/lib/python2.7/site-packages> (from grpcio<2,>=1.12.1->apache-beam==2.18.0.dev0) (1.1.6)
Processing /home/jenkins/.cache/pip/wheels/9b/04/dd/7daf4150b6d9b12949298737de9431a324d4b797ffd63f526e/docopt-0.6.2-py2.py3-none-any.whl
Collecting requests>=2.7.0
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting pbr>=0.11
  Using cached https://files.pythonhosted.org/packages/46/a4/d5c83831a3452713e4b4f126149bc4fbda170f7cb16a86a00ce57ce0e9ad/pbr-5.4.3-py2.py3-none-any.whl
Collecting pyasn1>=0.1.7
  Using cached https://files.pythonhosted.org/packages/62/1e/a94a8d635fa3ce4cfc7f506003548d0a2447ae76fd5ca53932970fe3053f/pyasn1-0.4.8-py2.py3-none-any.whl
Collecting pyasn1-modules>=0.0.5
  Using cached https://files.pythonhosted.org/packages/52/50/bb4cefca37da63a0c52218ba2cb1b1c36110d84dcbae8aa48cd67c5e95c2/pyasn1_modules-0.2.7-py2.py3-none-any.whl
Collecting rsa>=3.1.4
  Using cached https://files.pythonhosted.org/packages/02/e5/38518af393f7c214357079ce67a317307936896e961e35450b70fad2a9cf/rsa-4.0-py2.py3-none-any.whl
Requirement already satisfied: setuptools in <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/lib/python2.7/site-packages> (from protobuf<4,>=3.5.0.post1->apache-beam==2.18.0.dev0) (41.6.0)
Collecting pyparsing>=2.1.4
  Using cached https://files.pythonhosted.org/packages/c0/0c/fc2e007d9a992d997f04a80125b0f183da7fb554f1de701bbb70a8e7d479/pyparsing-2.4.5-py2.py3-none-any.whl
Collecting monotonic>=0.6; python_version == "2.7"
  Using cached https://files.pythonhosted.org/packages/ac/aa/063eca6a416f397bd99552c534c6d11d57f58f2e94c14780f3bbf818c4cf/monotonic-1.5-py2.py3-none-any.whl
Collecting atomicwrites>=1.0
  Using cached https://files.pythonhosted.org/packages/52/90/6155aa926f43f2b2a22b01be7241be3bfd1ceaf7d0b3267213e8127d41f4/atomicwrites-1.3.0-py2.py3-none-any.whl
Collecting packaging
  Using cached https://files.pythonhosted.org/packages/cf/94/9672c2d4b126e74c4496c6b3c58a8b51d6419267be9e70660ba23374c875/packaging-19.2-py2.py3-none-any.whl
Collecting wcwidth
  Using cached https://files.pythonhosted.org/packages/7e/9f/526a6947247599b084ee5232e4f9190a38f398d7300d866af3ab571a5bfe/wcwidth-0.1.7-py2.py3-none-any.whl
Requirement already satisfied: importlib-metadata>=0.12; python_version < "3.8" in <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (0.23)
Requirement already satisfied: py>=1.5.0 in <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (1.8.0)
Requirement already satisfied: pathlib2>=2.2.0; python_version < "3.6" in <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (2.3.5)
Requirement already satisfied: pluggy<1.0,>=0.12 in <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (0.13.0)
Collecting attrs>=17.4.0
  Using cached https://files.pythonhosted.org/packages/a2/db/4313ab3be961f7a763066401fb77f7748373b6094076ae2bda2806988af6/attrs-19.3.0-py2.py3-none-any.whl
Requirement already satisfied: more-itertools<6.0.0,>=4.0.0; python_version <= "2.7" in <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (5.0.0)
Collecting pytest-forked
  Using cached https://files.pythonhosted.org/packages/03/1e/81235e1fcfed57a4e679d34794d60c01a1e9a29ef5b9844d797716111d80/pytest_forked-1.1.3-py2.py3-none-any.whl
Collecting execnet>=1.1
  Using cached https://files.pythonhosted.org/packages/d3/2e/c63af07fa471e0a02d05793c7a56a9f7d274a8489442a5dc4fb3b2b3c705/execnet-1.7.1-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1
  Using cached https://files.pythonhosted.org/packages/b4/40/a9837291310ee1ccc242ceb6ebfd9eb21539649f193a7c8c86ba15b98539/urllib3-1.25.7-py2.py3-none-any.whl
Collecting certifi>=2017.4.17
  Using cached https://files.pythonhosted.org/packages/18/b0/8146a4f8dd402f60744fa380bc73ca47303cccf8b9190fd16a827281eac2/certifi-2019.9.11-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Requirement already satisfied: contextlib2; python_version < "3" in <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (0.6.0.post1)
Requirement already satisfied: zipp>=0.5 in <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (0.6.0)
Requirement already satisfied: configparser>=3.5; python_version < "3" in <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (4.0.2)
Requirement already satisfied: scandir; python_version < "3.5" in <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/lib/python2.7/site-packages> (from pathlib2>=2.2.0; python_version < "3.6"->pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (1.10.0)
Collecting apipkg>=1.4
  Using cached https://files.pythonhosted.org/packages/67/08/4815a09603fc800209431bec5b8bd2acf2f95abdfb558a44a42507fb94da/apipkg-1.5-py2.py3-none-any.whl
Installing collected packages: crcmod, dill, fastavro, docopt, urllib3, certifi, chardet, idna, requests, hdfs, httplib2, pbr, funcsigs, mock, numpy, pymongo, pyasn1, pyasn1-modules, rsa, oauth2client, pyparsing, pydot, python-dateutil, pytz, avro, pyvcf, typing, pyarrow, nose, nose-xunitmp, pandas, parameterized, pyhamcrest, pyyaml, requests-mock, monotonic, tenacity, atomicwrites, packaging, wcwidth, attrs, pytest, pytest-forked, apipkg, execnet, pytest-xdist, apache-beam
  Running setup.py develop for apache-beam
Successfully installed apache-beam apipkg-1.5 atomicwrites-1.3.0 attrs-19.3.0 avro-1.9.1 certifi-2019.9.11 chardet-3.0.4 crcmod-1.7 dill-0.3.1.1 docopt-0.6.2 execnet-1.7.1 fastavro-0.21.24 funcsigs-1.0.2 hdfs-2.5.8 httplib2-0.12.0 idna-2.8 mock-2.0.0 monotonic-1.5 nose-1.3.7 nose-xunitmp-0.4.1 numpy-1.16.5 oauth2client-3.0.0 packaging-19.2 pandas-0.24.2 parameterized-0.6.3 pbr-5.4.3 pyarrow-0.15.1 pyasn1-0.4.8 pyasn1-modules-0.2.7 pydot-1.4.1 pyhamcrest-1.9.0 pymongo-3.9.0 pyparsing-2.4.5 pytest-4.6.6 pytest-forked-1.1.3 pytest-xdist-1.30.0 python-dateutil-2.8.1 pytz-2019.3 pyvcf-0.6.8 pyyaml-5.1.2 requests-2.22.0 requests-mock-1.7.0 rsa-4.0 tenacity-5.1.5 typing-3.6.6 urllib3-1.25.7 wcwidth-0.1.7

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':runners:spark:compileJava'.
> Compilation failed with exit code 1; see the compiler error output for details.

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2m 30s
57 actionable tasks: 45 executed, 12 from cache

Publishing build scan...
https://gradle.com/s/qhqbxkvwhg36c

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1584

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1584/display/redirect?page=changes>

Changes:

[valentyn] Restore original behavior of evaluating worker host on Windows until a

[tvalentyn] [BEAM-8575] Add a Python test to test windowing in DoFn finish_bundle()


------------------------------------------
[...truncated 1.32 MB...]
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/20 19:14:15 INFO sdk_worker.run: No more requests from control plane
19/11/20 19:14:15 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/20 19:14:15 INFO data_plane.close: Closing all cached grpc data channels.
19/11/20 19:14:15 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/20 19:14:15 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 19:14:15 INFO sdk_worker.run: Done consuming work.
19/11/20 19:14:15 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/20 19:14:15 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/20 19:14:15 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 19:14:15 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestOOq7us/job_ed6b8902-ae16-4424-9fdd-6a8fee08e82d/MANIFEST
19/11/20 19:14:15 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestOOq7us/job_ed6b8902-ae16-4424-9fdd-6a8fee08e82d/MANIFEST -> 0 artifacts
19/11/20 19:14:16 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/20 19:14:16 INFO sdk_worker_main.main: Logging handler created.
19/11/20 19:14:16 INFO sdk_worker_main.start: Status HTTP server running at localhost:34245
19/11/20 19:14:16 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/20 19:14:16 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/20 19:14:16 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574277253.98_9bf73e42-a507-4e31-92ca-7394328931e8', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/20 19:14:16 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574277253.98', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:59367'}
19/11/20 19:14:16 INFO statecache.__init__: Creating state cache with size 0
19/11/20 19:14:16 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41077.
19/11/20 19:14:16 INFO sdk_worker.__init__: Control channel established.
19/11/20 19:14:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/20 19:14:16 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/20 19:14:16 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37003.
19/11/20 19:14:16 INFO sdk_worker.create_state_handler: State channel established.
19/11/20 19:14:16 INFO data_plane.create_data_channel: Creating client data channel for localhost:33635
19/11/20 19:14:16 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/20 19:14:16 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/20 19:14:16 INFO sdk_worker.run: No more requests from control plane
19/11/20 19:14:16 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/20 19:14:16 INFO data_plane.close: Closing all cached grpc data channels.
19/11/20 19:14:16 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 19:14:16 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/20 19:14:16 INFO sdk_worker.run: Done consuming work.
19/11/20 19:14:16 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/20 19:14:16 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/20 19:14:16 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 19:14:16 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestOOq7us/job_ed6b8902-ae16-4424-9fdd-6a8fee08e82d/MANIFEST
19/11/20 19:14:16 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestOOq7us/job_ed6b8902-ae16-4424-9fdd-6a8fee08e82d/MANIFEST -> 0 artifacts
19/11/20 19:14:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/20 19:14:17 INFO sdk_worker_main.main: Logging handler created.
19/11/20 19:14:17 INFO sdk_worker_main.start: Status HTTP server running at localhost:39895
19/11/20 19:14:17 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/20 19:14:17 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/20 19:14:17 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574277253.98_9bf73e42-a507-4e31-92ca-7394328931e8', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/20 19:14:17 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574277253.98', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:59367'}
19/11/20 19:14:17 INFO statecache.__init__: Creating state cache with size 0
19/11/20 19:14:17 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45021.
19/11/20 19:14:17 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/20 19:14:17 INFO sdk_worker.__init__: Control channel established.
19/11/20 19:14:17 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/20 19:14:17 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42387.
19/11/20 19:14:17 INFO sdk_worker.create_state_handler: State channel established.
19/11/20 19:14:17 INFO data_plane.create_data_channel: Creating client data channel for localhost:44627
19/11/20 19:14:17 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/20 19:14:17 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/20 19:14:17 INFO sdk_worker.run: No more requests from control plane
19/11/20 19:14:17 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/20 19:14:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 19:14:17 INFO data_plane.close: Closing all cached grpc data channels.
19/11/20 19:14:17 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/20 19:14:17 INFO sdk_worker.run: Done consuming work.
19/11/20 19:14:17 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/20 19:14:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/20 19:14:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 19:14:17 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestOOq7us/job_ed6b8902-ae16-4424-9fdd-6a8fee08e82d/MANIFEST
19/11/20 19:14:17 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestOOq7us/job_ed6b8902-ae16-4424-9fdd-6a8fee08e82d/MANIFEST -> 0 artifacts
19/11/20 19:14:18 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/20 19:14:18 INFO sdk_worker_main.main: Logging handler created.
19/11/20 19:14:18 INFO sdk_worker_main.start: Status HTTP server running at localhost:46343
19/11/20 19:14:18 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/20 19:14:18 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/20 19:14:18 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574277253.98_9bf73e42-a507-4e31-92ca-7394328931e8', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/20 19:14:18 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574277253.98', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:59367'}
19/11/20 19:14:18 INFO statecache.__init__: Creating state cache with size 0
19/11/20 19:14:18 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40907.
19/11/20 19:14:18 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/20 19:14:18 INFO sdk_worker.__init__: Control channel established.
19/11/20 19:14:18 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/20 19:14:18 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39025.
19/11/20 19:14:18 INFO sdk_worker.create_state_handler: State channel established.
19/11/20 19:14:18 INFO data_plane.create_data_channel: Creating client data channel for localhost:33269
19/11/20 19:14:18 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/20 19:14:18 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/20 19:14:18 INFO sdk_worker.run: No more requests from control plane
19/11/20 19:14:18 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/20 19:14:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 19:14:18 INFO data_plane.close: Closing all cached grpc data channels.
19/11/20 19:14:18 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/20 19:14:18 INFO sdk_worker.run: Done consuming work.
19/11/20 19:14:18 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/20 19:14:18 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/20 19:14:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 19:14:18 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestOOq7us/job_ed6b8902-ae16-4424-9fdd-6a8fee08e82d/MANIFEST
19/11/20 19:14:18 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestOOq7us/job_ed6b8902-ae16-4424-9fdd-6a8fee08e82d/MANIFEST -> 0 artifacts
19/11/20 19:14:19 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/20 19:14:19 INFO sdk_worker_main.main: Logging handler created.
19/11/20 19:14:19 INFO sdk_worker_main.start: Status HTTP server running at localhost:45939
19/11/20 19:14:19 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/20 19:14:19 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/20 19:14:19 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574277253.98_9bf73e42-a507-4e31-92ca-7394328931e8', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/20 19:14:19 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574277253.98', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:59367'}
19/11/20 19:14:19 INFO statecache.__init__: Creating state cache with size 0
19/11/20 19:14:19 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42623.
19/11/20 19:14:19 INFO sdk_worker.__init__: Control channel established.
19/11/20 19:14:19 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/20 19:14:19 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/20 19:14:19 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43403.
19/11/20 19:14:19 INFO sdk_worker.create_state_handler: State channel established.
19/11/20 19:14:19 INFO data_plane.create_data_channel: Creating client data channel for localhost:34361
19/11/20 19:14:19 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/20 19:14:19 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/20 19:14:19 INFO sdk_worker.run: No more requests from control plane
19/11/20 19:14:19 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/20 19:14:19 INFO data_plane.close: Closing all cached grpc data channels.
19/11/20 19:14:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 19:14:19 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/20 19:14:19 INFO sdk_worker.run: Done consuming work.
19/11/20 19:14:19 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/20 19:14:19 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/20 19:14:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 19:14:19 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574277253.98_9bf73e42-a507-4e31-92ca-7394328931e8 finished.
19/11/20 19:14:19 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/20 19:14:19 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestOOq7us/job_ed6b8902-ae16-4424-9fdd-6a8fee08e82d/MANIFEST has 0 artifact locations
19/11/20 19:14:19 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestOOq7us/job_ed6b8902-ae16-4424-9fdd-6a8fee08e82d/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140603866212096)>

# Thread: <Thread(Thread-117, started daemon 140603857819392)>

# Thread: <_MainThread(MainThread, started 140604728981248)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140603849426688)>

# Thread: <Thread(Thread-123, started daemon 140603841033984)>

# Thread: <Thread(Thread-117, started daemon 140603857819392)>

# Thread: <_MainThread(MainThread, started 140604728981248)>

# Thread: <Thread(wait_until_finish_read, started daemon 140603866212096)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574277244.51_b57b91ce-c06c-4771-a521-b1a93f091bfe failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
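
Unlike the two timeouts above, test_sdf_with_watermark_tracking fails outright: the portable Spark runner starts bundles without registering a handler for SDK-initiated checkpoints, so the first residual the splittable DoFn hands back trips the UnsupportedOperationException. A minimal sketch of such a registration, assuming the runner-side BundleCheckpointHandler callback of this Beam version receives the ProcessBundleResponse that carries the residual_roots field (the wiring around it is hypothetical):

    import org.apache.beam.model.fnexecution.v1.BeamFnApi.ProcessBundleResponse;
    import org.apache.beam.runners.fnexecution.control.BundleCheckpointHandler;

    // Sketch only: a handler that surfaces residuals instead of leaving the
    // ActiveBundle without any checkpoint handler. A real runner would
    // reschedule response.getResidualRootsList() for later execution rather
    // than logging and dropping the checkpointed work.
    BundleCheckpointHandler logAndDrop =
        (ProcessBundleResponse response) ->
            System.err.println(
                "Dropping " + response.getResidualRootsCount()
                    + " residual root(s); SDF self-checkpointing is not"
                    + " supported by this runner.");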

----------------------------------------------------------------------
Ran 38 tests in 319.106s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 51s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/7pxsli7o4mlpm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1583

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1583/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/20 18:16:33 INFO sdk_worker.run: No more requests from control plane
19/11/20 18:16:33 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/20 18:16:33 INFO data_plane.close: Closing all cached grpc data channels.
19/11/20 18:16:33 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 18:16:33 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/20 18:16:33 INFO sdk_worker.run: Done consuming work.
19/11/20 18:16:33 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/20 18:16:33 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/20 18:16:33 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 18:16:33 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestKZGyLG/job_f89e68fe-7caf-4fa7-8e03-b566c8a8baed/MANIFEST
19/11/20 18:16:33 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestKZGyLG/job_f89e68fe-7caf-4fa7-8e03-b566c8a8baed/MANIFEST -> 0 artifacts
19/11/20 18:16:34 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/20 18:16:34 INFO sdk_worker_main.main: Logging handler created.
19/11/20 18:16:34 INFO sdk_worker_main.start: Status HTTP server running at localhost:33485
19/11/20 18:16:34 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/20 18:16:34 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/20 18:16:34 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574273791.59_0d4b4012-9da0-4544-8298-4d1be898d48f', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/20 18:16:34 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574273791.59', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36781'}
19/11/20 18:16:34 INFO statecache.__init__: Creating state cache with size 0
19/11/20 18:16:34 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43273.
19/11/20 18:16:34 INFO sdk_worker.__init__: Control channel established.
19/11/20 18:16:34 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/20 18:16:34 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/20 18:16:34 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41885.
19/11/20 18:16:34 INFO sdk_worker.create_state_handler: State channel established.
19/11/20 18:16:34 INFO data_plane.create_data_channel: Creating client data channel for localhost:40517
19/11/20 18:16:34 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/20 18:16:34 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/20 18:16:34 INFO sdk_worker.run: No more requests from control plane
19/11/20 18:16:34 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/20 18:16:34 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 18:16:34 INFO data_plane.close: Closing all cached grpc data channels.
19/11/20 18:16:34 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/20 18:16:34 INFO sdk_worker.run: Done consuming work.
19/11/20 18:16:34 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/20 18:16:34 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/20 18:16:34 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 18:16:34 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestKZGyLG/job_f89e68fe-7caf-4fa7-8e03-b566c8a8baed/MANIFEST
19/11/20 18:16:34 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestKZGyLG/job_f89e68fe-7caf-4fa7-8e03-b566c8a8baed/MANIFEST -> 0 artifacts
19/11/20 18:16:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/20 18:16:35 INFO sdk_worker_main.main: Logging handler created.
19/11/20 18:16:35 INFO sdk_worker_main.start: Status HTTP server running at localhost:37303
19/11/20 18:16:35 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/20 18:16:35 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/20 18:16:35 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574273791.59_0d4b4012-9da0-4544-8298-4d1be898d48f', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/20 18:16:35 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574273791.59', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36781'}
19/11/20 18:16:35 INFO statecache.__init__: Creating state cache with size 0
19/11/20 18:16:35 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36883.
19/11/20 18:16:35 INFO sdk_worker.__init__: Control channel established.
19/11/20 18:16:35 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/20 18:16:35 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/20 18:16:35 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34991.
19/11/20 18:16:35 INFO sdk_worker.create_state_handler: State channel established.
19/11/20 18:16:35 INFO data_plane.create_data_channel: Creating client data channel for localhost:42903
19/11/20 18:16:35 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/20 18:16:35 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/20 18:16:35 INFO sdk_worker.run: No more requests from control plane
19/11/20 18:16:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 18:16:35 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/20 18:16:35 INFO data_plane.close: Closing all cached grpc data channels.
19/11/20 18:16:35 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/20 18:16:35 INFO sdk_worker.run: Done consuming work.
19/11/20 18:16:35 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/20 18:16:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/20 18:16:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 18:16:35 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestKZGyLG/job_f89e68fe-7caf-4fa7-8e03-b566c8a8baed/MANIFEST
19/11/20 18:16:35 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestKZGyLG/job_f89e68fe-7caf-4fa7-8e03-b566c8a8baed/MANIFEST -> 0 artifacts
19/11/20 18:16:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/20 18:16:36 INFO sdk_worker_main.main: Logging handler created.
19/11/20 18:16:36 INFO sdk_worker_main.start: Status HTTP server running at localhost:46025
19/11/20 18:16:36 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/20 18:16:36 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/20 18:16:36 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574273791.59_0d4b4012-9da0-4544-8298-4d1be898d48f', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/20 18:16:36 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574273791.59', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36781'}
19/11/20 18:16:36 INFO statecache.__init__: Creating state cache with size 0
19/11/20 18:16:36 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43245.
19/11/20 18:16:36 INFO sdk_worker.__init__: Control channel established.
19/11/20 18:16:36 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/20 18:16:36 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/20 18:16:36 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46067.
19/11/20 18:16:36 INFO sdk_worker.create_state_handler: State channel established.
19/11/20 18:16:36 INFO data_plane.create_data_channel: Creating client data channel for localhost:45015
19/11/20 18:16:36 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/20 18:16:36 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/20 18:16:36 INFO sdk_worker.run: No more requests from control plane
19/11/20 18:16:36 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/20 18:16:36 INFO data_plane.close: Closing all cached grpc data channels.
19/11/20 18:16:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 18:16:36 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/20 18:16:36 INFO sdk_worker.run: Done consuming work.
19/11/20 18:16:36 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/20 18:16:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/20 18:16:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 18:16:36 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestKZGyLG/job_f89e68fe-7caf-4fa7-8e03-b566c8a8baed/MANIFEST
19/11/20 18:16:36 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestKZGyLG/job_f89e68fe-7caf-4fa7-8e03-b566c8a8baed/MANIFEST -> 0 artifacts
19/11/20 18:16:37 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/20 18:16:37 INFO sdk_worker_main.main: Logging handler created.
19/11/20 18:16:37 INFO sdk_worker_main.start: Status HTTP server running at localhost:45929
19/11/20 18:16:37 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/20 18:16:37 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/20 18:16:37 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574273791.59_0d4b4012-9da0-4544-8298-4d1be898d48f', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/20 18:16:37 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574273791.59', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36781'}
19/11/20 18:16:37 INFO statecache.__init__: Creating state cache with size 0
19/11/20 18:16:37 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42109.
19/11/20 18:16:37 INFO sdk_worker.__init__: Control channel established.
19/11/20 18:16:37 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/20 18:16:37 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/20 18:16:37 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33951.
19/11/20 18:16:37 INFO sdk_worker.create_state_handler: State channel established.
19/11/20 18:16:37 INFO data_plane.create_data_channel: Creating client data channel for localhost:40235
19/11/20 18:16:37 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/20 18:16:37 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/20 18:16:37 INFO sdk_worker.run: No more requests from control plane
19/11/20 18:16:37 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/20 18:16:37 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 18:16:37 INFO data_plane.close: Closing all cached grpc data channels.
19/11/20 18:16:37 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/20 18:16:37 INFO sdk_worker.run: Done consuming work.
19/11/20 18:16:37 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/20 18:16:37 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/20 18:16:37 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 18:16:37 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574273791.59_0d4b4012-9da0-4544-8298-4d1be898d48f finished.
19/11/20 18:16:37 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/20 18:16:37 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestKZGyLG/job_f89e68fe-7caf-4fa7-8e03-b566c8a8baed/MANIFEST has 0 artifact locations
19/11/20 18:16:37 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestKZGyLG/job_f89e68fe-7caf-4fa7-8e03-b566c8a8baed/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
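For reference, the PROCESS environment visible in the sdk_worker_main log lines above is selected entirely through pipeline options; a minimal sketch of the equivalent client-side configuration (the worker script path is an illustrative placeholder, not this job's actual value):

    from apache_beam.options.pipeline_options import PipelineOptions

    # Illustrative sketch only: mirrors the environment_type and
    # environment_config options logged by the harness above.
    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:36781',
        '--environment_type=PROCESS',
        '--environment_config={"command": "/path/to/sdk_worker.sh"}',
    ])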
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
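The timeout above is raised by a watchdog inside the test itself: handler() fires after 60 seconds while wait_until_finish is still blocked on the gRPC state stream, and the same watchdog appears to be what emits the "# Thread:" stack dumps interleaved in the output below. A minimal sketch of that signal-based pattern (names are illustrative, not the test's exact code):

    import signal

    def install_timeout(seconds):
        # When the alarm fires, raise from whatever frame is currently
        # running, mirroring the handler in the traceback above.
        def handler(signum, frame):
            raise BaseException('Timed out after %d seconds.' % seconds)
        signal.signal(signal.SIGALRM, handler)
        signal.alarm(seconds)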

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139798457431808)>

# Thread: <Thread(Thread-117, started daemon 139798474217216)>

# Thread: <_MainThread(MainThread, started 139799253317376)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139797830039296)>

# Thread: <Thread(Thread-123, started daemon 139798447990528)>

# Thread: <Thread(Thread-117, started daemon 139798474217216)>

# Thread: <Thread(wait_until_finish_read, started daemon 139798457431808)>

# Thread: <_MainThread(MainThread, started 139799253317376)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574273781.5_1d32254b-d72e-4703-af24-108efdd6e5d0 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 329.404s

FAILED (errors=3, skipped=9)
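All three failures surface through Beam's pipeline-level assertions rather than plain unittest asserts; for reference, a self-contained sketch of the assert_that/equal_to idiom these tests rely on (pipeline contents are illustrative):

    import apache_beam as beam
    from apache_beam.testing.util import assert_that, equal_to

    # assert_that attaches a verification transform; the pipeline fails
    # at run time if the PCollection does not match the matcher.
    with beam.Pipeline() as p:
        actual = p | beam.Create([1, 2, 3])
        assert_that(actual, equal_to([1, 2, 3]))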

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 9s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/sqvpae7wglgbu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1582

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1582/display/redirect?page=changes>

Changes:

[katarzyna.kucharczyk] [BEAM-6335] Added streaming GroupByKey Test that reads SyntheticSource

[katarzyna.kucharczyk] [BEAM-6335] Changed SyntheticDataPublisher to publish String UTF values

[katarzyna.kucharczyk] [BEAM-6335] Added custom PubSub Matcher that stops pipeline after


------------------------------------------
[...truncated 1.32 MB...]
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/20 16:12:26 INFO sdk_worker.run: No more requests from control plane
19/11/20 16:12:26 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/20 16:12:26 INFO data_plane.close: Closing all cached grpc data channels.
19/11/20 16:12:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 16:12:26 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/20 16:12:26 INFO sdk_worker.run: Done consuming work.
19/11/20 16:12:26 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/20 16:12:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/20 16:12:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 16:12:26 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestW7ULi4/job_b37a36a1-b132-407c-9d9b-59ed53942602/MANIFEST
19/11/20 16:12:26 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestW7ULi4/job_b37a36a1-b132-407c-9d9b-59ed53942602/MANIFEST -> 0 artifacts
19/11/20 16:12:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/20 16:12:27 INFO sdk_worker_main.main: Logging handler created.
19/11/20 16:12:27 INFO sdk_worker_main.start: Status HTTP server running at localhost:35045
19/11/20 16:12:27 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/20 16:12:27 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/20 16:12:27 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574266344.56_674a73e6-484f-44e8-9a8a-8263dc7b59c1', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/20 16:12:27 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574266344.56', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40225'}
19/11/20 16:12:27 INFO statecache.__init__: Creating state cache with size 0
19/11/20 16:12:27 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34259.
19/11/20 16:12:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/20 16:12:27 INFO sdk_worker.__init__: Control channel established.
19/11/20 16:12:27 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/20 16:12:27 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36055.
19/11/20 16:12:27 INFO sdk_worker.create_state_handler: State channel established.
19/11/20 16:12:27 INFO data_plane.create_data_channel: Creating client data channel for localhost:33087
19/11/20 16:12:27 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/20 16:12:27 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/20 16:12:27 INFO sdk_worker.run: No more requests from control plane
19/11/20 16:12:27 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/20 16:12:27 INFO data_plane.close: Closing all cached grpc data channels.
19/11/20 16:12:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 16:12:27 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/20 16:12:27 INFO sdk_worker.run: Done consuming work.
19/11/20 16:12:27 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/20 16:12:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/20 16:12:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 16:12:27 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestW7ULi4/job_b37a36a1-b132-407c-9d9b-59ed53942602/MANIFEST
19/11/20 16:12:27 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestW7ULi4/job_b37a36a1-b132-407c-9d9b-59ed53942602/MANIFEST -> 0 artifacts
19/11/20 16:12:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/20 16:12:27 INFO sdk_worker_main.main: Logging handler created.
19/11/20 16:12:27 INFO sdk_worker_main.start: Status HTTP server running at localhost:33445
19/11/20 16:12:27 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/20 16:12:27 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/20 16:12:27 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574266344.56_674a73e6-484f-44e8-9a8a-8263dc7b59c1', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/20 16:12:27 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574266344.56', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40225'}
19/11/20 16:12:27 INFO statecache.__init__: Creating state cache with size 0
19/11/20 16:12:27 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41183.
19/11/20 16:12:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/20 16:12:27 INFO sdk_worker.__init__: Control channel established.
19/11/20 16:12:27 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/20 16:12:27 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40033.
19/11/20 16:12:27 INFO sdk_worker.create_state_handler: State channel established.
19/11/20 16:12:27 INFO data_plane.create_data_channel: Creating client data channel for localhost:41617
19/11/20 16:12:27 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/20 16:12:28 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/20 16:12:28 INFO sdk_worker.run: No more requests from control plane
19/11/20 16:12:28 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/20 16:12:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 16:12:28 INFO data_plane.close: Closing all cached grpc data channels.
19/11/20 16:12:28 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/20 16:12:28 INFO sdk_worker.run: Done consuming work.
19/11/20 16:12:28 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/20 16:12:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/20 16:12:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 16:12:28 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestW7ULi4/job_b37a36a1-b132-407c-9d9b-59ed53942602/MANIFEST
19/11/20 16:12:28 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestW7ULi4/job_b37a36a1-b132-407c-9d9b-59ed53942602/MANIFEST -> 0 artifacts
19/11/20 16:12:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/20 16:12:28 INFO sdk_worker_main.main: Logging handler created.
19/11/20 16:12:28 INFO sdk_worker_main.start: Status HTTP server running at localhost:42061
19/11/20 16:12:28 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/20 16:12:28 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/20 16:12:28 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574266344.56_674a73e6-484f-44e8-9a8a-8263dc7b59c1', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/20 16:12:28 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574266344.56', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40225'}
19/11/20 16:12:28 INFO statecache.__init__: Creating state cache with size 0
19/11/20 16:12:28 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35603.
19/11/20 16:12:28 INFO sdk_worker.__init__: Control channel established.
19/11/20 16:12:28 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/20 16:12:28 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/20 16:12:28 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45559.
19/11/20 16:12:28 INFO sdk_worker.create_state_handler: State channel established.
19/11/20 16:12:28 INFO data_plane.create_data_channel: Creating client data channel for localhost:39969
19/11/20 16:12:28 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/20 16:12:28 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/20 16:12:28 INFO sdk_worker.run: No more requests from control plane
19/11/20 16:12:28 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/20 16:12:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 16:12:28 INFO data_plane.close: Closing all cached grpc data channels.
19/11/20 16:12:28 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/20 16:12:28 INFO sdk_worker.run: Done consuming work.
19/11/20 16:12:28 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/20 16:12:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/20 16:12:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 16:12:29 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestW7ULi4/job_b37a36a1-b132-407c-9d9b-59ed53942602/MANIFEST
19/11/20 16:12:29 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestW7ULi4/job_b37a36a1-b132-407c-9d9b-59ed53942602/MANIFEST -> 0 artifacts
19/11/20 16:12:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/20 16:12:29 INFO sdk_worker_main.main: Logging handler created.
19/11/20 16:12:29 INFO sdk_worker_main.start: Status HTTP server running at localhost:37131
19/11/20 16:12:29 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/20 16:12:29 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/20 16:12:29 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574266344.56_674a73e6-484f-44e8-9a8a-8263dc7b59c1', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/20 16:12:29 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574266344.56', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40225'}
19/11/20 16:12:29 INFO statecache.__init__: Creating state cache with size 0
19/11/20 16:12:29 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43047.
19/11/20 16:12:29 INFO sdk_worker.__init__: Control channel established.
19/11/20 16:12:29 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/20 16:12:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/20 16:12:29 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41369.
19/11/20 16:12:29 INFO sdk_worker.create_state_handler: State channel established.
19/11/20 16:12:29 INFO data_plane.create_data_channel: Creating client data channel for localhost:37469
19/11/20 16:12:29 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/20 16:12:29 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/20 16:12:29 INFO sdk_worker.run: No more requests from control plane
19/11/20 16:12:29 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/20 16:12:29 INFO data_plane.close: Closing all cached grpc data channels.
19/11/20 16:12:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 16:12:29 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/20 16:12:29 INFO sdk_worker.run: Done consuming work.
19/11/20 16:12:29 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/20 16:12:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/20 16:12:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 16:12:29 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574266344.56_674a73e6-484f-44e8-9a8a-8263dc7b59c1 finished.
19/11/20 16:12:29 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/20 16:12:29 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestW7ULi4/job_b37a36a1-b132-407c-9d9b-59ed53942602/MANIFEST has 0 artifact locations
19/11/20 16:12:29 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestW7ULi4/job_b37a36a1-b132-407c-9d9b-59ed53942602/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140000187152128)>

# Thread: <Thread(Thread-119, started daemon 140000178759424)>

# Thread: <_MainThread(MainThread, started 140000974776064)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139999685109504)>

# Thread: <Thread(Thread-125, started daemon 139999693502208)>

# Thread: <Thread(Thread-119, started daemon 140000178759424)>

# Thread: <Thread(wait_until_finish_read, started daemon 140000187152128)>

# Thread: <_MainThread(MainThread, started 140000974776064)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574266335.15_dd034f60-ffee-43d9-b383-52d4e21101f7 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 311.035s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 36s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/ridcats5w6vm4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1581

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1581/display/redirect?page=changes>

Changes:

[iemejia] [website] Add Spark Structured Runner VR badge to the github template


------------------------------------------
[...truncated 1.33 MB...]
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/20 15:29:17 INFO sdk_worker.run: No more requests from control plane
19/11/20 15:29:17 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/20 15:29:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 15:29:17 INFO data_plane.close: Closing all cached grpc data channels.
19/11/20 15:29:17 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/20 15:29:17 INFO sdk_worker.run: Done consuming work.
19/11/20 15:29:17 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/20 15:29:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/20 15:29:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 15:29:17 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest6jEPN_/job_2077b66a-291a-483a-86ca-67c72da1e3aa/MANIFEST
19/11/20 15:29:17 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest6jEPN_/job_2077b66a-291a-483a-86ca-67c72da1e3aa/MANIFEST -> 0 artifacts
19/11/20 15:29:18 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/20 15:29:18 INFO sdk_worker_main.main: Logging handler created.
19/11/20 15:29:18 INFO sdk_worker_main.start: Status HTTP server running at localhost:33255
19/11/20 15:29:18 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/20 15:29:18 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/20 15:29:18 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574263754.84_ade20626-357b-466c-b102-ad5ae0e97fed', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/20 15:29:18 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574263754.84', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:55261'}
19/11/20 15:29:18 INFO statecache.__init__: Creating state cache with size 0
19/11/20 15:29:18 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46739.
19/11/20 15:29:18 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/20 15:29:18 INFO sdk_worker.__init__: Control channel established.
19/11/20 15:29:18 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/20 15:29:18 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:32855.
19/11/20 15:29:18 INFO sdk_worker.create_state_handler: State channel established.
19/11/20 15:29:18 INFO data_plane.create_data_channel: Creating client data channel for localhost:40133
19/11/20 15:29:18 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/20 15:29:18 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/20 15:29:18 INFO sdk_worker.run: No more requests from control plane
19/11/20 15:29:18 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/20 15:29:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 15:29:18 INFO data_plane.close: Closing all cached grpc data channels.
19/11/20 15:29:18 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/20 15:29:18 INFO sdk_worker.run: Done consuming work.
19/11/20 15:29:18 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/20 15:29:18 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/20 15:29:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 15:29:18 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest6jEPN_/job_2077b66a-291a-483a-86ca-67c72da1e3aa/MANIFEST
19/11/20 15:29:18 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest6jEPN_/job_2077b66a-291a-483a-86ca-67c72da1e3aa/MANIFEST -> 0 artifacts
19/11/20 15:29:19 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/20 15:29:19 INFO sdk_worker_main.main: Logging handler created.
19/11/20 15:29:19 INFO sdk_worker_main.start: Status HTTP server running at localhost:40533
19/11/20 15:29:19 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/20 15:29:19 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/20 15:29:19 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574263754.84_ade20626-357b-466c-b102-ad5ae0e97fed', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/20 15:29:19 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574263754.84', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:55261'}
19/11/20 15:29:19 INFO statecache.__init__: Creating state cache with size 0
19/11/20 15:29:19 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35805.
19/11/20 15:29:19 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/20 15:29:19 INFO sdk_worker.__init__: Control channel established.
19/11/20 15:29:19 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/20 15:29:19 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39937.
19/11/20 15:29:19 INFO sdk_worker.create_state_handler: State channel established.
19/11/20 15:29:19 INFO data_plane.create_data_channel: Creating client data channel for localhost:40395
19/11/20 15:29:19 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/20 15:29:19 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/20 15:29:19 INFO sdk_worker.run: No more requests from control plane
19/11/20 15:29:19 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/20 15:29:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 15:29:19 INFO data_plane.close: Closing all cached grpc data channels.
19/11/20 15:29:19 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/20 15:29:19 INFO sdk_worker.run: Done consuming work.
19/11/20 15:29:19 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/20 15:29:19 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/20 15:29:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 15:29:19 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest6jEPN_/job_2077b66a-291a-483a-86ca-67c72da1e3aa/MANIFEST
19/11/20 15:29:19 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest6jEPN_/job_2077b66a-291a-483a-86ca-67c72da1e3aa/MANIFEST -> 0 artifacts
19/11/20 15:29:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/20 15:29:20 INFO sdk_worker_main.main: Logging handler created.
19/11/20 15:29:20 INFO sdk_worker_main.start: Status HTTP server running at localhost:41471
19/11/20 15:29:20 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/20 15:29:20 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/20 15:29:20 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574263754.84_ade20626-357b-466c-b102-ad5ae0e97fed', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/20 15:29:20 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574263754.84', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:55261'}
19/11/20 15:29:20 INFO statecache.__init__: Creating state cache with size 0
19/11/20 15:29:20 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43295.
19/11/20 15:29:20 INFO sdk_worker.__init__: Control channel established.
19/11/20 15:29:20 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/20 15:29:20 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/20 15:29:20 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41307.
19/11/20 15:29:20 INFO sdk_worker.create_state_handler: State channel established.
19/11/20 15:29:20 INFO data_plane.create_data_channel: Creating client data channel for localhost:39417
19/11/20 15:29:20 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/20 15:29:20 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/20 15:29:20 INFO sdk_worker.run: No more requests from control plane
19/11/20 15:29:20 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/20 15:29:20 INFO data_plane.close: Closing all cached grpc data channels.
19/11/20 15:29:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 15:29:20 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/20 15:29:20 INFO sdk_worker.run: Done consuming work.
19/11/20 15:29:20 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/20 15:29:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/20 15:29:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 15:29:20 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest6jEPN_/job_2077b66a-291a-483a-86ca-67c72da1e3aa/MANIFEST
19/11/20 15:29:20 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest6jEPN_/job_2077b66a-291a-483a-86ca-67c72da1e3aa/MANIFEST -> 0 artifacts
19/11/20 15:29:21 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/20 15:29:21 INFO sdk_worker_main.main: Logging handler created.
19/11/20 15:29:21 INFO sdk_worker_main.start: Status HTTP server running at localhost:45453
19/11/20 15:29:21 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/20 15:29:21 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/20 15:29:21 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574263754.84_ade20626-357b-466c-b102-ad5ae0e97fed', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/20 15:29:21 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574263754.84', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:55261'}
19/11/20 15:29:21 INFO statecache.__init__: Creating state cache with size 0
19/11/20 15:29:21 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46301.
19/11/20 15:29:21 INFO sdk_worker.__init__: Control channel established.
19/11/20 15:29:21 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/20 15:29:21 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/20 15:29:21 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42205.
19/11/20 15:29:21 INFO sdk_worker.create_state_handler: State channel established.
19/11/20 15:29:21 INFO data_plane.create_data_channel: Creating client data channel for localhost:39601
19/11/20 15:29:21 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/20 15:29:21 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/20 15:29:21 INFO sdk_worker.run: No more requests from control plane
19/11/20 15:29:21 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/20 15:29:21 INFO data_plane.close: Closing all cached grpc data channels.
19/11/20 15:29:21 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 15:29:21 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/20 15:29:21 INFO sdk_worker.run: Done consuming work.
19/11/20 15:29:21 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/20 15:29:21 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/20 15:29:21 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 15:29:21 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574263754.84_ade20626-357b-466c-b102-ad5ae0e97fed finished.
19/11/20 15:29:21 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/20 15:29:21 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktest6jEPN_/job_2077b66a-291a-483a-86ca-67c72da1e3aa/MANIFEST has 0 artifact locations
19/11/20 15:29:21 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktest6jEPN_/job_2077b66a-291a-483a-86ca-67c72da1e3aa/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140555841054464)>

# Thread: <Thread(Thread-120, started daemon 140555857839872)>

# Thread: <_MainThread(MainThread, started 140556637071104)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140555749156608)>

# Thread: <Thread(Thread-126, started daemon 140555757549312)>

# Thread: <_MainThread(MainThread, started 140556637071104)>

# Thread: <Thread(Thread-120, started daemon 140555857839872)>

# Thread: <Thread(wait_until_finish_read, started daemon 140555841054464)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574263740.48_7d045688-2a19-4ddd-a1f0-973b04fb4056 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 355.672s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 10m 25s
60 actionable tasks: 48 executed, 12 from cache

Publishing build scan...
https://gradle.com/s/q5ao6hobhhng4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1580

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1580/display/redirect?page=changes>

Changes:

[echauchot] [BEAM-8470] Add an empty spark-structured-streaming runner project

[echauchot] [BEAM-8470] Fix missing dep

[echauchot] [BEAM-8470] Add SparkPipelineOptions

[echauchot] [BEAM-8470] Start pipeline translation

[echauchot] [BEAM-8470] Add global pipeline translation structure

[echauchot] [BEAM-8470] Add nodes translators structure

[echauchot] [BEAM-8470] Wire node translators with pipeline translator

[echauchot] [BEAM-8470] Renames: better differenciate pipeline translator for

[echauchot] [BEAM-8470] Organise methods in PipelineTranslator

[echauchot] [BEAM-8470] Initialise BatchTranslationContext

[echauchot] [BEAM-8470] Refactoring: -move batch/streaming common translation

[echauchot] [BEAM-8470] Make transform translation clearer: renaming, comments

[echauchot] [BEAM-8470] Improve javadocs

[echauchot] [BEAM-8470] Move SparkTransformOverrides to correct package

[echauchot] [BEAM-8470] Move common translation context components to superclass

[echauchot] [BEAM-8470] apply spotless

[echauchot] [BEAM-8470] Make codestyle and firebug happy

[echauchot] [BEAM-8470] Add TODOs

[echauchot] [BEAM-8470] Post-pone batch qualifier in all classes names for

[echauchot] [BEAM-8470] Add precise TODO for multiple TransformTranslator per

[echauchot] [BEAM-8470] Added SparkRunnerRegistrar

[echauchot] [BEAM-8470] Add basic pipeline execution. Refactor translatePipeline()

[echauchot] [BEAM-8470] Create PCollections manipulation methods

[echauchot] [BEAM-8470] Create Datasets manipulation methods

[echauchot] [BEAM-8470] Add Flatten transformation translator

[echauchot] [BEAM-8470] Add primitive GroupByKeyTranslatorBatch implementation

[echauchot] [BEAM-8470] Use Iterators.transform() to return Iterable

[echauchot] [BEAM-8470] Implement read transform

[echauchot] [BEAM-8470] update TODO

[echauchot] [BEAM-8470] Apply spotless

[echauchot] [BEAM-8470] start source instantiation

[echauchot] [BEAM-8470] Improve exception flow

[echauchot] [BEAM-8470] Improve type enforcement in ReadSourceTranslator

[echauchot] [BEAM-8470] Experiment over using spark Catalog to pass in Beam Source

[echauchot] [BEAM-8470] Add source mocks

[echauchot] [BEAM-8470] fix mock, wire mock in translators and create a main test.

[echauchot] [BEAM-8470] Use raw WindowedValue so that spark Encoders could work

[echauchot] [BEAM-8470] clean deps

[echauchot] [BEAM-8470] Move DatasetSourceMock to proper batch mode

[echauchot] [BEAM-8470] Run pipeline in batch mode or in streaming mode

[echauchot] [BEAM-8470] Split batch and streaming sources and translators

[echauchot] [BEAM-8470] Use raw Encoder<WindowedValue> also in regular

[echauchot] [BEAM-8470] Clean

[echauchot] [BEAM-8470] Add ReadSourceTranslatorStreaming

[echauchot] [BEAM-8470] Move Source and translator mocks to a mock package.

[echauchot] [BEAM-8470] Pass Beam Source and PipelineOptions to the spark DataSource

[echauchot] [BEAM-8470] Refactor DatasetSource fields

[echauchot] [BEAM-8470] Wire real SourceTransform and not mock and update the test

[echauchot] [BEAM-8470] Add missing 0-arg public constructor

[echauchot] [BEAM-8470] Use new PipelineOptionsSerializationUtils

[echauchot] [BEAM-8470] Apply spotless and fix  checkstyle

[echauchot] [BEAM-8470] Add a dummy schema for reader

[echauchot] [BEAM-8470] Add empty 0-arg constructor for mock source

[echauchot] [BEAM-8470] Clean

[echauchot] [BEAM-8470] Checkstyle and Findbugs

[echauchot] [BEAM-8470] Refactor SourceTest to a UTest instead of a main

[echauchot] [BEAM-8470] Fix pipeline triggering: use a spark action instead of

[echauchot] [BEAM-8470] improve readability of options passing to the source

[echauchot] [BEAM-8470] Clean unneeded fields in DatasetReader

[echauchot] [BEAM-8470] Fix serialization issues

[echauchot] [BEAM-8470] Add SerializationDebugger

[echauchot] [BEAM-8470] Add serialization test

[echauchot] [BEAM-8470] Move SourceTest to same package as tested class

[echauchot] [BEAM-8470] Fix SourceTest

[echauchot] [BEAM-8470] Simplify beam reader creation as it created once the source

[echauchot] [BEAM-8470] Put all transform translators Serializable

[echauchot] [BEAM-8470] Enable test mode

[echauchot] [BEAM-8470] Enable gradle build scan

[echauchot] [BEAM-8470] Add flatten test

[echauchot] [BEAM-8470] First attempt for ParDo primitive implementation

[echauchot] [BEAM-8470] Serialize windowedValue to byte[] in source to be able to

[echauchot] [BEAM-8470] Comment schema choices

[echauchot] [BEAM-8470] Fix errorprone

[echauchot] [BEAM-8470] Fix testMode output to comply with new binary schema

[echauchot] [BEAM-8470] Cleaning

[echauchot] [BEAM-8470] Remove bundleSize parameter and always use spark default

[echauchot] [BEAM-8470] Fix split bug

[echauchot] [BEAM-8470] Clean

[echauchot] [BEAM-8470] Add ParDoTest

[echauchot] [BEAM-8470] Address minor review notes

[echauchot] [BEAM-8470] Clean

[echauchot] [BEAM-8470] Add GroupByKeyTest

[echauchot] [BEAM-8470] Add comments and TODO to GroupByKeyTranslatorBatch

[echauchot] [BEAM-8470] Fix type checking with Encoder of WindowedValue<T>

[echauchot] [BEAM-8470] Port latest changes of ReadSourceTranslatorBatch to

[echauchot] [BEAM-8470] Remove no more needed putDatasetRaw

[echauchot] [BEAM-8470] Add ComplexSourceTest

[echauchot] [BEAM-8470] Fail in case of having SideInputs or State/Timers

[echauchot] [BEAM-8470] Fix Encoders: create an Encoder for every manipulated type

[echauchot] [BEAM-8470] Apply spotless

[echauchot] [BEAM-8470] Fixed Javadoc error

[echauchot] [BEAM-8470] Rename SparkSideInputReader class and rename pruneOutput()

[echauchot] [BEAM-8470] Don't use deprecated

[echauchot] [BEAM-8470] Simplify logic of ParDo translator

[echauchot] [BEAM-8470] Fix kryo issue in GBK translator with a workaround

[echauchot] [BEAM-8470] Rename SparkOutputManager for consistency

[echauchot] [BEAM-8470] Fix for test elements container in GroupByKeyTest

[echauchot] [BEAM-8470] Added "testTwoPardoInRow"

[echauchot] [BEAM-8470] Add a test for the most simple possible Combine

[echauchot] [BEAM-8470] Rename SparkDoFnFilterFunction to DoFnFilterFunction for

[echauchot] [BEAM-8470] Generalize the use of SerializablePipelineOptions in place

[echauchot] [BEAM-8470] Fix getSideInputs

[echauchot] [BEAM-8470] Extract binary schema creation in a helper class

[echauchot] [BEAM-8470] First version of combinePerKey

[echauchot] [BEAM-8470] Improve type checking of Tuple2 encoder

[echauchot] [BEAM-8470] Introduce WindowingHelpers (and helpers package) and use it

[echauchot] [BEAM-8470] Fix combiner using KV as input, use binary encoders in place

[echauchot] [BEAM-8470] Add combinePerKey and CombineGlobally tests

[echauchot] [BEAM-8470] Introduce RowHelpers

[echauchot] [BEAM-8470] Add CombineGlobally translation to avoid translating

[echauchot] [BEAM-8470] Cleaning

[echauchot] [BEAM-8470] Get back to classes in translators resolution because URNs

[echauchot] [BEAM-8470] Fix various type checking issues in Combine.Globally

[echauchot] [BEAM-8470] Update test with Long

[echauchot] [BEAM-8470] Fix combine. For unknown reason GenericRowWithSchema is used

[echauchot] [BEAM-8470] Use more generic Row instead of GenericRowWithSchema

[echauchot] [BEAM-8470] Add explanation about receiving a Row as input in the

[echauchot] [BEAM-8470] Fix encoder bug in combinePerkey

[echauchot] [BEAM-8470] Cleaning

[echauchot] [BEAM-8470] Implement WindowAssignTranslatorBatch

[echauchot] [BEAM-8470] Implement WindowAssignTest

[echauchot] [BEAM-8470] Fix javadoc

[echauchot] [BEAM-8470] Added SideInput support

[echauchot] [BEAM-8470] Fix CheckStyle violations

[echauchot] [BEAM-8470] Don't use Reshuffle translation

[echauchot] [BEAM-8470] Added using CachedSideInputReader

[echauchot] [BEAM-8470] Added TODO comment for ReshuffleTranslatorBatch

[echauchot] [BEAM-8470] Add unchecked warning suppression

[echauchot] [BEAM-8470] Add streaming source initialisation

[echauchot] [BEAM-8470] Implement first streaming source

[echauchot] [BEAM-8470] Add a TODO on spark output modes

[echauchot] [BEAM-8470] Add transformators registry in PipelineTranslatorStreaming

[echauchot] [BEAM-8470] Add source streaming test

[echauchot] [BEAM-8470] Specify checkpointLocation at the pipeline start

[echauchot] [BEAM-8470] Clean unneeded 0 arg constructor in batch source

[echauchot] [BEAM-8470] Clean streaming source

[echauchot] [BEAM-8470] Continue impl of offsets for streaming source

[echauchot] [BEAM-8470] Deal with checkpoint and offset based read

[echauchot] [BEAM-8470] Apply spotless and fix spotbugs warnings

[echauchot] [BEAM-8470] Disable never ending test

[echauchot] [BEAM-8470] Fix access level issues, typos and modernize code to Java 8

[echauchot] [BEAM-8470] Merge Spark Structured Streaming runner into main Spark

[echauchot] [BEAM-8470] Fix non-vendored imports from Spark Streaming Runner classes

[echauchot] [BEAM-8470] Pass doFnSchemaInformation to ParDo batch translation

[echauchot] [BEAM-8470] Fix spotless issues after rebase

[echauchot] [BEAM-8470] Fix logging levels in Spark Structured Streaming translation

[echauchot] [BEAM-8470] Add SparkStructuredStreamingPipelineOptions and

[echauchot] [BEAM-8470] Rename SparkPipelineResult to

[echauchot] [BEAM-8470] Use PAssert in Spark Structured Streaming transform tests

[echauchot] [BEAM-8470] Ignore spark offsets (cf javadoc)

[echauchot] [BEAM-8470] implement source.stop

[echauchot] [BEAM-8470] Update javadoc

[echauchot] [BEAM-8470] Apply Spotless

[echauchot] [BEAM-8470] Enable batch Validates Runner tests for Structured Streaming

[echauchot] [BEAM-8470] Limit the number of partitions to make tests go 300% faster

[echauchot] [BEAM-8470] Fixes ParDo not calling setup and not tearing down if

[echauchot] [BEAM-8470] Pass transform based doFnSchemaInformation in ParDo

[echauchot] [BEAM-8470] Consider null object case on RowHelpers, fixes empty side

[echauchot] [BEAM-8470] Put back batch/simpleSourceTest.testBoundedSource

[echauchot] [BEAM-8470] Update windowAssignTest

[echauchot] [BEAM-8470] Add comment about checkpoint mark

[echauchot] [BEAM-8470] Re-code GroupByKeyTranslatorBatch to conserve windowing

[echauchot] [BEAM-8470] re-enable reduceFnRunner timers for output

[echauchot] [BEAM-8470] Improve visibility of debug messages

[echauchot] [BEAM-8470] Add a test that GBK preserves windowing

[echauchot] [BEAM-8470] Add TODO in Combine translations

[echauchot] [BEAM-8470] Update KVHelpers.extractKey() to deal with WindowedValue and

[echauchot] [BEAM-8470] Fix comment about schemas

[echauchot] [BEAM-8470] Implement reduce part of CombineGlobally translation with

[echauchot] [BEAM-8470] Output data after combine

[echauchot] [BEAM-8470] Implement merge accumulators part of CombineGlobally

[echauchot] [BEAM-8470] Fix encoder in combine call

[echauchot] [BEAM-8470] Revert extractKey while combinePerKey is not done (so that

[echauchot] [BEAM-8470] Apply a groupByKey avoids for some reason that the spark

[echauchot] [BEAM-8470] Fix case when a window does not merge into any other window

[echauchot] [BEAM-8470] Fix wrong encoder in combineGlobally GBK

[echauchot] [BEAM-8470] Fix bug in the window merging logic

[echauchot] [BEAM-8470] Remove the mapPartition that adds a key per partition

[echauchot] [BEAM-8470] Remove CombineGlobally translation because it is less

[echauchot] [BEAM-8470] Now that there is only Combine.PerKey translation, make only

[echauchot] [BEAM-8470] Clean no more needed KVHelpers

[echauchot] [BEAM-8470] Clean no more needed RowHelpers

[echauchot] [BEAM-8470] Clean no more needed WindowingHelpers

[echauchot] [BEAM-8470] Fix javadoc of AggregatorCombiner

[echauchot] [BEAM-8470] Fixed immutable list bug

[echauchot] [BEAM-8470] add comment in combine globally test

[echauchot] [BEAM-8470] Clean groupByKeyTest

[echauchot] [BEAM-8470] Add a test that combine per key preserves windowing

[echauchot] [BEAM-8470] Ignore for now not working test testCombineGlobally

[echauchot] [BEAM-8470] Add metrics support in DoFn

[echauchot] [BEAM-8470] Add missing dependencies to run Spark Structured Streaming

[echauchot] [BEAM-8470] Add setEnableSparkMetricSinks() method

[echauchot] [BEAM-8470] Fix javadoc

[echauchot] [BEAM-8470] Fix accumulators initialization in Combine that prevented

[echauchot] [BEAM-8470] Add a test to check that CombineGlobally preserves windowing

[echauchot] [BEAM-8470] Persist all output Dataset if there are multiple outputs in

[echauchot] [BEAM-8470] Added metrics sinks and tests

[echauchot] [BEAM-8470] Make spotless happy

[echauchot] [BEAM-8470] Add PipelineResults to Spark structured streaming.

[echauchot] [BEAM-8470] Update log4j configuration

[echauchot] [BEAM-8470] Add spark execution plans extended debug messages.

[echauchot] [BEAM-8470] Print number of leaf datasets

[echauchot] [BEAM-8470] fixup! Add PipelineResults to Spark structured streaming.

[echauchot] [BEAM-8470] Remove no more needed AggregatorCombinerPerKey (there is

[echauchot] [BEAM-8470] After testing performance and correctness, launch pipeline

[echauchot] [BEAM-8470] Improve Pardo translation performance: avoid calling a

[echauchot] [BEAM-8470] Use "sparkMaster" in local mode to obtain number of shuffle

[echauchot] [BEAM-8470] Wrap Beam Coders into Spark Encoders using

[echauchot] [BEAM-8470] Wrap Beam Coders into Spark Encoders using

[echauchot] [BEAM-8470] type erasure: spark encoders require a Class<T>, pass Object

[echauchot] [BEAM-8470] Fix scala Product in Encoders to avoid StackOverflow

[echauchot] [BEAM-8470] Conform to spark ExpressionEncoders: pass classTags,

[echauchot] [BEAM-8470] Add a simple spark native test to test Beam coders wrapping

[echauchot] [BEAM-8470] Fix code generation in Beam coder wrapper

[echauchot] [BEAM-8470] Lazy init coder because coder instance cannot be

[echauchot] [BEAM-8470] Fix warning in coder construction by reflection

[echauchot] [BEAM-8470] Fix ExpressionEncoder generated code: typos, try catch, fqcn

[echauchot] [BEAM-8470] Fix getting the output value in code generation

[echauchot] [BEAM-8470] Fix beam coder lazy init using reflection: use .clas + try

[echauchot] [BEAM-8470] Remove lazy init of beam coder because there is no generic

[echauchot] [BEAM-8470] Remove example code

[echauchot] [BEAM-8470] Fix equal and hashcode

[echauchot] [BEAM-8470] Fix generated code: uniform exceptions catching, fix

[echauchot] [BEAM-8470] Add an assert of equality in the encoders test

[echauchot] [BEAM-8470] Apply spotless and checkstyle and add javadocs

[echauchot] [BEAM-8470] Wrap exceptions in UserCoderExceptions

[echauchot] [BEAM-8470] Put Encoders expressions serializable

[echauchot] [BEAM-8470] Catch Exception instead of IOException because some coders

[echauchot] [BEAM-8470] Apply new Encoders to CombinePerKey

[echauchot] [BEAM-8470] Apply new Encoders to Read source

[echauchot] [BEAM-8470] Improve performance of source: the mapper already calls

[echauchot] [BEAM-8470] Ignore long time failing test: SparkMetricsSinkTest

[echauchot] [BEAM-8470] Apply new Encoders to Window assign translation

[echauchot] [BEAM-8470] Apply new Encoders to AggregatorCombiner

[echauchot] [BEAM-8470] Create a Tuple2Coder to encode scala tuple2

[echauchot] [BEAM-8470] Apply new Encoders to GroupByKey

[echauchot] [BEAM-8470] Apply new Encoders to Pardo. Replace Tuple2Coder with

[echauchot] [BEAM-8470] Apply spotless, fix typo and javadoc

[echauchot] [BEAM-8470] Use beam encoders also in the output of the source

[echauchot] [BEAM-8470] Remove unneeded cast

[echauchot] [BEAM-8470] Fix: Remove generic hack of using object. Use actual Coder

[echauchot] [BEAM-8470] Remove Encoders based on kryo now that we call Beam coders

[echauchot] [BEAM-8470] Add a jenkins job for validates runner tests in the new

[echauchot] [BEAM-8470] Apply spotless

[echauchot] [BEAM-8470] Rebase on master: pass sideInputMapping in SimpleDoFnRunner

[echauchot] Fix SpotBugs

[echauchot] [BEAM-8470] simplify coders in combinePerKey translation

[echauchot] [BEAM-8470] Fix combiner. Do not reuse instance of accumulator

[echauchot] [BEAM-8470] input windows can arrive exploded (for sliding windows). As

[echauchot] [BEAM-8470] Add a combine test with sliding windows

[echauchot] [BEAM-8470] Add a test to test combine translation on binaryCombineFn

[echauchot] [BEAM-8470] Fix tests: use correct

[echauchot] [BEAM-8470] Fix wrong expected results in

[echauchot] [BEAM-8470] Add disclaimers about this runner being experimental

[echauchot] [BEAM-8470] Fix: create an empty accumulator in

[echauchot] [BEAM-8470] Apply spotless

[echauchot] [BEAM-8470] Add a countPerElement test with sliding windows

[echauchot] [BEAM-8470] Fix the output timestamps of combine: timestamps must be

[echauchot] [BEAM-8470] set log level to info to avoid resource consumption in

[echauchot] [BEAM-8470] Fix CombineTest.testCountPerElementWithSlidingWindows

[aromanenko.dev] [BEAM-8470] Remove "validatesStructuredStreamingRunnerBatch" from

[echauchot] [BEAM-8470] Fix timestamps in combine output: assign the timestamp to


------------------------------------------
[...truncated 1.32 MB...]
19/11/20 14:38:35 INFO sdk_worker.create_state_handler: State channel established.
19/11/20 14:38:35 INFO data_plane.create_data_channel: Creating client data channel for localhost:46283
19/11/20 14:38:35 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/20 14:38:35 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/20 14:38:35 INFO sdk_worker.run: No more requests from control plane
19/11/20 14:38:35 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/20 14:38:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 14:38:35 INFO data_plane.close: Closing all cached grpc data channels.
19/11/20 14:38:35 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/20 14:38:35 INFO sdk_worker.run: Done consuming work.
19/11/20 14:38:35 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/20 14:38:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/20 14:38:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 14:38:35 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestM8epv3/job_a170b361-0556-4262-8ba2-aeaec0641ddd/MANIFEST
19/11/20 14:38:35 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestM8epv3/job_a170b361-0556-4262-8ba2-aeaec0641ddd/MANIFEST -> 0 artifacts
19/11/20 14:38:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/20 14:38:36 INFO sdk_worker_main.main: Logging handler created.
19/11/20 14:38:36 INFO sdk_worker_main.start: Status HTTP server running at localhost:34981
19/11/20 14:38:36 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/20 14:38:36 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/20 14:38:36 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574260714.18_ce6f6c60-4aee-43c2-a2d6-5dd7ae57b385', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/20 14:38:36 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574260714.18', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:33243'}
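For reference, the pipeline_options dumped in the line above map to flags of Beam's portable runner. A minimal sketch of launching a pipeline against the same kind of setup (the endpoint and sdk_worker.sh path here are placeholders, not this job's real values):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:33243',  # job server endpoint, as in the log
        '--environment_type=PROCESS',      # run the SDK harness as a local process
        '--environment_config={"command": "/path/to/sdk_worker.sh"}',
        '--experiments=beam_fn_api',
    ])

    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create(['a', 'b']) | beam.Map(lambda x: x)  # no-op pipeline for illustration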
19/11/20 14:38:36 INFO statecache.__init__: Creating state cache with size 0
19/11/20 14:38:36 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36079.
19/11/20 14:38:36 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/20 14:38:36 INFO sdk_worker.__init__: Control channel established.
19/11/20 14:38:36 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/20 14:38:36 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34975.
19/11/20 14:38:36 INFO sdk_worker.create_state_handler: State channel established.
19/11/20 14:38:36 INFO data_plane.create_data_channel: Creating client data channel for localhost:40539
19/11/20 14:38:36 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/20 14:38:36 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/20 14:38:36 INFO sdk_worker.run: No more requests from control plane
19/11/20 14:38:36 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/20 14:38:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 14:38:36 INFO data_plane.close: Closing all cached grpc data channels.
19/11/20 14:38:36 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/20 14:38:36 INFO sdk_worker.run: Done consuming work.
19/11/20 14:38:36 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/20 14:38:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/20 14:38:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 14:38:36 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestM8epv3/job_a170b361-0556-4262-8ba2-aeaec0641ddd/MANIFEST
19/11/20 14:38:36 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestM8epv3/job_a170b361-0556-4262-8ba2-aeaec0641ddd/MANIFEST -> 0 artifacts
19/11/20 14:38:37 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/20 14:38:37 INFO sdk_worker_main.main: Logging handler created.
19/11/20 14:38:37 INFO sdk_worker_main.start: Status HTTP server running at localhost:46867
19/11/20 14:38:37 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/20 14:38:37 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/20 14:38:37 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574260714.18_ce6f6c60-4aee-43c2-a2d6-5dd7ae57b385', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/20 14:38:37 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574260714.18', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:33243'}
19/11/20 14:38:37 INFO statecache.__init__: Creating state cache with size 0
19/11/20 14:38:37 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44281.
19/11/20 14:38:37 INFO sdk_worker.__init__: Control channel established.
19/11/20 14:38:37 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/20 14:38:37 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/20 14:38:37 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44181.
19/11/20 14:38:37 INFO sdk_worker.create_state_handler: State channel established.
19/11/20 14:38:37 INFO data_plane.create_data_channel: Creating client data channel for localhost:40627
19/11/20 14:38:37 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/20 14:38:37 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/20 14:38:37 INFO sdk_worker.run: No more requests from control plane
19/11/20 14:38:37 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/20 14:38:37 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 14:38:37 INFO data_plane.close: Closing all cached grpc data channels.
19/11/20 14:38:37 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/20 14:38:37 INFO sdk_worker.run: Done consuming work.
19/11/20 14:38:37 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/20 14:38:37 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/20 14:38:37 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 14:38:37 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestM8epv3/job_a170b361-0556-4262-8ba2-aeaec0641ddd/MANIFEST
19/11/20 14:38:37 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestM8epv3/job_a170b361-0556-4262-8ba2-aeaec0641ddd/MANIFEST -> 0 artifacts
19/11/20 14:38:38 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/20 14:38:38 INFO sdk_worker_main.main: Logging handler created.
19/11/20 14:38:38 INFO sdk_worker_main.start: Status HTTP server running at localhost:44003
19/11/20 14:38:38 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/20 14:38:38 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/20 14:38:38 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574260714.18_ce6f6c60-4aee-43c2-a2d6-5dd7ae57b385', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/20 14:38:38 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574260714.18', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:33243'}
19/11/20 14:38:38 INFO statecache.__init__: Creating state cache with size 0
19/11/20 14:38:38 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37061.
19/11/20 14:38:38 INFO sdk_worker.__init__: Control channel established.
19/11/20 14:38:38 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/20 14:38:38 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/20 14:38:38 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40607.
19/11/20 14:38:38 INFO sdk_worker.create_state_handler: State channel established.
19/11/20 14:38:38 INFO data_plane.create_data_channel: Creating client data channel for localhost:36721
19/11/20 14:38:38 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/20 14:38:38 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/20 14:38:38 INFO sdk_worker.run: No more requests from control plane
19/11/20 14:38:38 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/20 14:38:38 INFO data_plane.close: Closing all cached grpc data channels.
19/11/20 14:38:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 14:38:38 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/20 14:38:38 INFO sdk_worker.run: Done consuming work.
19/11/20 14:38:38 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/20 14:38:38 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/20 14:38:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 14:38:38 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestM8epv3/job_a170b361-0556-4262-8ba2-aeaec0641ddd/MANIFEST
19/11/20 14:38:38 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestM8epv3/job_a170b361-0556-4262-8ba2-aeaec0641ddd/MANIFEST -> 0 artifacts
19/11/20 14:38:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/20 14:38:39 INFO sdk_worker_main.main: Logging handler created.
19/11/20 14:38:39 INFO sdk_worker_main.start: Status HTTP server running at localhost:44923
19/11/20 14:38:39 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/20 14:38:39 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/20 14:38:39 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574260714.18_ce6f6c60-4aee-43c2-a2d6-5dd7ae57b385', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/20 14:38:39 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574260714.18', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:33243'}
19/11/20 14:38:39 INFO statecache.__init__: Creating state cache with size 0
19/11/20 14:38:39 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38011.
19/11/20 14:38:39 INFO sdk_worker.__init__: Control channel established.
19/11/20 14:38:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/20 14:38:39 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/20 14:38:39 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41833.
19/11/20 14:38:39 INFO sdk_worker.create_state_handler: State channel established.
19/11/20 14:38:39 INFO data_plane.create_data_channel: Creating client data channel for localhost:37195
19/11/20 14:38:39 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/20 14:38:39 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/20 14:38:39 INFO sdk_worker.run: No more requests from control plane
19/11/20 14:38:39 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/20 14:38:39 INFO data_plane.close: Closing all cached grpc data channels.
19/11/20 14:38:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 14:38:39 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/20 14:38:39 INFO sdk_worker.run: Done consuming work.
19/11/20 14:38:39 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/20 14:38:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/20 14:38:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 14:38:39 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574260714.18_ce6f6c60-4aee-43c2-a2d6-5dd7ae57b385 finished.
19/11/20 14:38:39 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/20 14:38:39 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestM8epv3/job_a170b361-0556-4262-8ba2-aeaec0641ddd/MANIFEST has 0 artifact locations
19/11/20 14:38:39 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestM8epv3/job_a170b361-0556-4262-8ba2-aeaec0641ddd/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140314589488896)>
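The truncated docstring above refers to stateful ParDo: state requests are addressed per key, so the runner must round-trip the key through its (possibly custom) coder on every access. A minimal sketch of a stateful DoFn of that shape (class and state names are illustrative, not the test's actual code):

    import apache_beam as beam
    from apache_beam.coders import VarIntCoder
    from apache_beam.transforms.userstate import BagStateSpec

    class StatefulSumDoFn(beam.DoFn):
        # Each state access below is scoped to the current element's key,
        # which the runner encodes with the key coder.
        BUFFER = BagStateSpec('buffer', VarIntCoder())

        def process(self, element, buffer=beam.DoFn.StateParam(BUFFER)):
            key, value = element  # stateful DoFns consume key/value pairs
            buffer.add(value)
            yield key, sum(buffer.read())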

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(Thread-118, started daemon 140314597881600)>
# Thread: <_MainThread(MainThread, started 140315379283712)>
# Thread: <Thread(wait_until_finish_read, started daemon 140314098530048)>
# Thread: <Thread(Thread-124, started daemon 140314090137344)>
# Thread: <_MainThread(MainThread, started 140315379283712)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574260705.52_39edadb2-ae7d-49b9-afe2-79d8bc21cf60 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
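The failing assertion in test_sdf_with_watermark_tracking is the assert_that/equal_to pattern used throughout these suites: the expected value list(''.join(data)) implies the pipeline emits the individual characters of the input strings. A minimal sketch of the same style of check, with hypothetical data in place of the test's:

    import apache_beam as beam
    from apache_beam.testing.util import assert_that, equal_to

    data = ['abc', 'def']
    with beam.Pipeline() as p:
        # FlatMap(list) splits each string into its characters.
        actual = p | beam.Create(data) | beam.FlatMap(list)
        assert_that(actual, equal_to(list(''.join(data))))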

----------------------------------------------------------------------
Ran 38 tests in 326.382s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 29s
60 actionable tasks: 52 executed, 8 from cache

Publishing build scan...
https://gradle.com/s/admroodybd366

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1579

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1579/display/redirect>

Changes:


------------------------------------------
[...truncated 1.67 MB...]
19/11/20 12:12:40 INFO DAGScheduler: waiting: Set(ShuffleMapStage 132, ResultStage 133)
19/11/20 12:12:40 INFO DAGScheduler: failed: Set()
19/11/20 12:12:40 INFO DAGScheduler: Submitting ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115), which has no missing parents
19/11/20 12:12:40 INFO MemoryStore: Block broadcast_129 stored as values in memory (estimated size 57.2 KB, free 13.5 GB)
19/11/20 12:12:40 INFO MemoryStore: Block broadcast_129_piece0 stored as bytes in memory (estimated size 22.1 KB, free 13.5 GB)
19/11/20 12:12:40 INFO BlockManagerInfo: Added broadcast_129_piece0 in memory on localhost:38565 (size: 22.1 KB, free: 13.5 GB)
19/11/20 12:12:40 INFO SparkContext: Created broadcast 129 from broadcast at DAGScheduler.scala:1161
19/11/20 12:12:40 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115) (first 15 tasks are for partitions Vector(0, 1))
19/11/20 12:12:40 INFO TaskSchedulerImpl: Adding task set 132.0 with 2 tasks
19/11/20 12:12:40 INFO TaskSetManager: Starting task 0.0 in stage 132.0 (TID 159, localhost, executor driver, partition 0, NODE_LOCAL, 7760 bytes)
19/11/20 12:12:40 INFO Executor: Running task 0.0 in stage 132.0 (TID 159)
19/11/20 12:12:40 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/20 12:12:40 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/20 12:12:40 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest4z3k22/job_d960caa0-65fe-4794-af47-330ec7b79b13/MANIFEST
19/11/20 12:12:40 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest4z3k22/job_d960caa0-65fe-4794-af47-330ec7b79b13/MANIFEST -> 0 artifacts
19/11/20 12:12:41 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/20 12:12:41 INFO main: Logging handler created.
19/11/20 12:12:41 INFO start: Status HTTP server running at localhost:45163
19/11/20 12:12:41 INFO main: semi_persistent_directory: /tmp
19/11/20 12:12:41 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/20 12:12:41 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574251958.25_c38fb241-20bb-442f-b26e-125ca5b5ab22', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/20 12:12:41 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574251958.25', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:57067'}
19/11/20 12:12:41 INFO __init__: Creating state cache with size 0
19/11/20 12:12:41 INFO __init__: Creating insecure control channel for localhost:45901.
19/11/20 12:12:41 INFO __init__: Control channel established.
19/11/20 12:12:41 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/20 12:12:41 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/20 12:12:41 INFO create_state_handler: Creating insecure state channel for localhost:42159.
19/11/20 12:12:41 INFO create_state_handler: State channel established.
19/11/20 12:12:41 INFO create_data_channel: Creating client data channel for localhost:34461
19/11/20 12:12:41 INFO GrpcDataService: Beam Fn Data client connected.
19/11/20 12:12:41 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/20 12:12:41 INFO run: No more requests from control plane
19/11/20 12:12:41 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/20 12:12:41 INFO close: Closing all cached grpc data channels.
19/11/20 12:12:41 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 12:12:41 INFO close: Closing all cached gRPC state handlers.
19/11/20 12:12:41 INFO run: Done consuming work.
19/11/20 12:12:41 INFO main: Python sdk harness exiting.
19/11/20 12:12:41 INFO GrpcLoggingService: Logging client hanged up.
19/11/20 12:12:41 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 12:12:41 INFO Executor: Finished task 0.0 in stage 132.0 (TID 159). 15229 bytes result sent to driver
19/11/20 12:12:41 INFO TaskSetManager: Starting task 1.0 in stage 132.0 (TID 160, localhost, executor driver, partition 1, PROCESS_LOCAL, 7977 bytes)
19/11/20 12:12:41 INFO Executor: Running task 1.0 in stage 132.0 (TID 160)
19/11/20 12:12:41 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 159) in 873 ms on localhost (executor driver) (1/2)
19/11/20 12:12:41 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest4z3k22/job_d960caa0-65fe-4794-af47-330ec7b79b13/MANIFEST
19/11/20 12:12:41 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest4z3k22/job_d960caa0-65fe-4794-af47-330ec7b79b13/MANIFEST -> 0 artifacts
19/11/20 12:12:42 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/20 12:12:42 INFO main: Logging handler created.
19/11/20 12:12:42 INFO start: Status HTTP server running at localhost:33653
19/11/20 12:12:42 INFO main: semi_persistent_directory: /tmp
19/11/20 12:12:42 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/20 12:12:42 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574251958.25_c38fb241-20bb-442f-b26e-125ca5b5ab22', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/20 12:12:42 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574251958.25', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:57067'}
19/11/20 12:12:42 INFO __init__: Creating state cache with size 0
19/11/20 12:12:42 INFO __init__: Creating insecure control channel for localhost:42111.
19/11/20 12:12:42 INFO __init__: Control channel established.
19/11/20 12:12:42 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/20 12:12:42 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/20 12:12:42 INFO create_state_handler: Creating insecure state channel for localhost:34409.
19/11/20 12:12:42 INFO create_state_handler: State channel established.
19/11/20 12:12:42 INFO create_data_channel: Creating client data channel for localhost:45309
19/11/20 12:12:42 INFO GrpcDataService: Beam Fn Data client connected.
19/11/20 12:12:42 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/20 12:12:42 INFO run: No more requests from control plane
19/11/20 12:12:42 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/20 12:12:42 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 12:12:42 INFO close: Closing all cached grpc data channels.
19/11/20 12:12:42 INFO close: Closing all cached gRPC state handlers.
19/11/20 12:12:42 INFO run: Done consuming work.
19/11/20 12:12:42 INFO main: Python sdk harness exiting.
19/11/20 12:12:42 INFO GrpcLoggingService: Logging client hanged up.
19/11/20 12:12:42 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 12:12:42 INFO Executor: Finished task 1.0 in stage 132.0 (TID 160). 13710 bytes result sent to driver
19/11/20 12:12:42 INFO TaskSetManager: Finished task 1.0 in stage 132.0 (TID 160) in 812 ms on localhost (executor driver) (2/2)
19/11/20 12:12:42 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/20 12:12:42 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 1.691 s
19/11/20 12:12:42 INFO DAGScheduler: looking for newly runnable stages
19/11/20 12:12:42 INFO DAGScheduler: running: Set()
19/11/20 12:12:42 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/20 12:12:42 INFO DAGScheduler: failed: Set()
19/11/20 12:12:42 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/20 12:12:42 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.5 GB)
19/11/20 12:12:42 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.4 KB, free 13.5 GB)
19/11/20 12:12:42 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:38565 (size: 12.4 KB, free: 13.5 GB)
19/11/20 12:12:42 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/20 12:12:42 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/20 12:12:42 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/20 12:12:42 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/20 12:12:42 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/20 12:12:42 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/20 12:12:42 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/20 12:12:42 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest4z3k22/job_d960caa0-65fe-4794-af47-330ec7b79b13/MANIFEST
19/11/20 12:12:42 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest4z3k22/job_d960caa0-65fe-4794-af47-330ec7b79b13/MANIFEST -> 0 artifacts
19/11/20 12:12:43 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/20 12:12:43 INFO main: Logging handler created.
19/11/20 12:12:43 INFO start: Status HTTP server running at localhost:46503
19/11/20 12:12:43 INFO main: semi_persistent_directory: /tmp
19/11/20 12:12:43 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/20 12:12:43 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574251958.25_c38fb241-20bb-442f-b26e-125ca5b5ab22', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/20 12:12:43 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574251958.25', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:57067'}
19/11/20 12:12:43 INFO __init__: Creating state cache with size 0
19/11/20 12:12:43 INFO __init__: Creating insecure control channel for localhost:35703.
19/11/20 12:12:43 INFO __init__: Control channel established.
19/11/20 12:12:43 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/20 12:12:43 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/20 12:12:43 INFO create_state_handler: Creating insecure state channel for localhost:44007.
19/11/20 12:12:43 INFO create_state_handler: State channel established.
19/11/20 12:12:43 INFO create_data_channel: Creating client data channel for localhost:36835
19/11/20 12:12:43 INFO GrpcDataService: Beam Fn Data client connected.
19/11/20 12:12:43 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/20 12:12:43 INFO run: No more requests from control plane
19/11/20 12:12:43 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/20 12:12:43 INFO close: Closing all cached grpc data channels.
19/11/20 12:12:43 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 12:12:43 INFO close: Closing all cached gRPC state handlers.
19/11/20 12:12:43 INFO run: Done consuming work.
19/11/20 12:12:43 INFO main: Python sdk harness exiting.
19/11/20 12:12:43 INFO GrpcLoggingService: Logging client hanged up.
19/11/20 12:12:43 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 12:12:43 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 11970 bytes result sent to driver
19/11/20 12:12:43 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 809 ms on localhost (executor driver) (1/1)
19/11/20 12:12:43 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/20 12:12:43 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 0.815 s
19/11/20 12:12:43 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 4.125154 s
19/11/20 12:12:43 INFO SparkPipelineRunner: Job test_windowing_1574251958.25_c38fb241-20bb-442f-b26e-125ca5b5ab22 finished.
19/11/20 12:12:43 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/20 12:12:43 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktest4z3k22/job_d960caa0-65fe-4794-af47-330ec7b79b13/MANIFEST has 0 artifact locations
19/11/20 12:12:43 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktest4z3k22/job_d960caa0-65fe-4794-af47-330ec7b79b13/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140321695659776)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 32# Thread: <Thread(Thread-118, started daemon 140321704052480)>

8, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
# Thread: <_MainThread(MainThread, started 140322491676416)>
==================== Timed out after 60 seconds. ====================

  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
# Thread: <Thread(wait_until_finish_read, started daemon 140321203676928)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
# Thread: <Thread(Thread-123, started daemon 140321212069632)>

    _common.wait(self._state.condition.wait, _response_ready)
# Thread: <_MainThread(MainThread, started 140322491676416)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
# Thread: <Thread(wait_until_finish_read, started daemon 140321695659776)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
# Thread: <Thread(Thread-118, started daemon 140321704052480)>
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

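The "Timed out after 60 seconds" banners and the "# Thread: ..." lines come from the suite's watchdog (the `handler` frame at the bottom of each traceback), which fires on a timer, prints the live threads, and then raises; because it writes concurrently with the unittest reporter, its output lands interleaved with the tracebacks in the raw log. A minimal sketch of such a watchdog, assuming a SIGALRM-based implementation (names are illustrative):

    import signal
    import threading

    TIMEOUT_SECS = 60

    def install_watchdog(timeout=TIMEOUT_SECS):
        def handler(signum, frame):
            msg = 'Timed out after %d seconds.' % timeout
            print('==================== %s ====================' % msg)
            # Print each live thread to show where the run is stuck.
            for t in threading.enumerate():
                print('# Thread: %s' % t)
            # BaseException (not Exception) so broad `except Exception`
            # clauses in the code under test cannot swallow the timeout.
            raise BaseException(msg)
        signal.signal(signal.SIGALRM, handler)
        signal.alarm(timeout)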
======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574251949.5_e1b55901-a7f4-4306-9988-86bfdc4bca85 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

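Unlike the timeouts, test_sdf_with_watermark_tracking fails fast: it runs a splittable DoFn (SDF) that tracks watermarks, and checkpointing an in-progress SDF bundle needs a bundle checkpoint handler that the Spark portable runner had not implemented at this point, hence the UnsupportedOperationException above. A rough sketch of an SDF of that shape, assuming Beam's Python RestrictionProvider API and omitting the watermark-estimation piece (illustrative, not the test's code):

    import apache_beam as beam
    from apache_beam.io.restriction_trackers import OffsetRange, OffsetRestrictionTracker
    from apache_beam.transforms.core import RestrictionProvider

    class CharRestrictionProvider(RestrictionProvider):
        # One claimable offset per character of the input string.
        def initial_restriction(self, element):
            return OffsetRange(0, len(element))

        def create_tracker(self, restriction):
            return OffsetRestrictionTracker(restriction)

        def restriction_size(self, element, restriction):
            return restriction.size()

    class ExpandStringsDoFn(beam.DoFn):
        def process(self, element,
                    tracker=beam.DoFn.RestrictionParam(CharRestrictionProvider())):
            cur = tracker.current_restriction().start
            # Claim positions one at a time; if the runner checkpoints the
            # bundle mid-restriction, the unclaimed remainder must be resumed
            # later -- the step the Spark portable runner could not handle.
            while tracker.try_claim(cur):
                yield element[cur]
                cur += 1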
----------------------------------------------------------------------
Ran 38 tests in 302.691s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 47s
59 actionable tasks: 46 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/fq2g2urfdxnta

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1578

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1578/display/redirect>

Changes:


------------------------------------------
[...truncated 1.77 MB...]
# Thread: <Thread(wait_until_finish_read, started daemon 139816435947264)>

# Thread: <Thread(wait_until_finish_read, started daemon 139815538386688)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139816486303488)>

# Thread: <Thread(Thread-144, started daemon 139816575923968)>

# Thread: <Thread(wait_until_finish_read, started daemon 139816452732672)>

# Thread: <Thread(wait_until_finish_read, started daemon 139815521601280)>

# Thread: <Thread(Thread-136, started daemon 139815529993984)>

# Thread: <Thread(Thread-123, started daemon 139816461125376)>

# Thread: <Thread(Thread-128, started daemon 139816444339968)>

# Thread: <_MainThread(MainThread, started 139817355024128)>

# Thread: <Thread(wait_until_finish_read, started daemon 139816435947264)>

======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

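Every timed-out test above hangs at the same frame: wait_until_finish blocking on the portable Job API's state stream. That is a server-streaming gRPC call, so each next() simply waits until the job service publishes another state change; if the Spark job server never reaches a terminal state, the client sits in grpc/_channel.py until the watchdog fires. A rough sketch of the pattern, assuming the beam_job_api gRPC stubs (illustrative):

    import grpc
    from apache_beam.portability.api import beam_job_api_pb2, beam_job_api_pb2_grpc

    def wait_until_finish(job_endpoint, job_id):
        stub = beam_job_api_pb2_grpc.JobServiceStub(
            grpc.insecure_channel(job_endpoint))
        terminal = (beam_job_api_pb2.JobState.DONE,
                    beam_job_api_pb2.JobState.FAILED,
                    beam_job_api_pb2.JobState.CANCELLED)
        # Server-streaming call: each iteration blocks inside grpc's
        # _channel.py, which is exactly where the timed-out tests are stuck.
        for state_response in stub.GetStateStream(
                beam_job_api_pb2.GetJobStateRequest(job_id=job_id)):
            if state_response.state in terminal:
                return state_response.state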
======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_unfusable_side_inputs (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 244, in test_pardo_unfusable_side_inputs
    equal_to([('a', 'a'), ('a', 'b'), ('b', 'a'), ('b', 'b')]))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_windowed_side_inputs (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 181, in test_pardo_windowed_side_inputs
    label='windowed')
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_read (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 578, in test_read
    equal_to(['a', 'b', 'c']))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_reshuffle (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 548, in test_reshuffle
    equal_to([1, 2, 3]))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_check_done_failed (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 470, in test_sdf_with_check_done_failed
    | beam.ParDo(ExpandingStringsDoFn()))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

----------------------------------------------------------------------
Ran 38 tests in 804.442s

FAILED (errors=8, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 16m 7s
59 actionable tasks: 46 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/cac4c7ydacmfi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1577

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1577/display/redirect?page=changes>

Changes:

[robertwb] More compartmentalization of bundle-based-runner only utilities.


------------------------------------------
[...truncated 1.65 MB...]
19/11/20 01:16:29 INFO main: Logging handler created.
19/11/20 01:16:29 INFO start: Status HTTP server running at localhost:43241
19/11/20 01:16:29 INFO main: semi_persistent_directory: /tmp
19/11/20 01:16:29 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/20 01:16:29 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574212586.72_785b8dfd-5286-41fd-89ee-998f1f3b6cc6', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/20 01:16:29 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574212586.72', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42579'}
19/11/20 01:16:29 INFO __init__: Creating state cache with size 0
19/11/20 01:16:29 INFO __init__: Creating insecure control channel for localhost:33535.
19/11/20 01:16:29 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/20 01:16:29 INFO __init__: Control channel established.
19/11/20 01:16:29 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/20 01:16:29 INFO create_state_handler: Creating insecure state channel for localhost:35797.
19/11/20 01:16:29 INFO create_state_handler: State channel established.
19/11/20 01:16:29 INFO create_data_channel: Creating client data channel for localhost:44353
19/11/20 01:16:29 INFO GrpcDataService: Beam Fn Data client connected.
19/11/20 01:16:29 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/20 01:16:29 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/20 01:16:29 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/20 01:16:29 INFO run: No more requests from control plane
19/11/20 01:16:29 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/20 01:16:29 INFO close: Closing all cached grpc data channels.
19/11/20 01:16:29 INFO close: Closing all cached gRPC state handlers.
19/11/20 01:16:29 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 01:16:29 INFO run: Done consuming work.
19/11/20 01:16:29 INFO main: Python sdk harness exiting.
19/11/20 01:16:29 INFO GrpcLoggingService: Logging client hanged up.
19/11/20 01:16:29 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 01:16:29 INFO Executor: Finished task 0.0 in stage 131.0 (TID 158). 12763 bytes result sent to driver
19/11/20 01:16:29 INFO TaskSetManager: Finished task 0.0 in stage 131.0 (TID 158) in 847 ms on localhost (executor driver) (1/1)
19/11/20 01:16:29 INFO TaskSchedulerImpl: Removed TaskSet 131.0, whose tasks have all completed, from pool 
19/11/20 01:16:29 INFO DAGScheduler: ShuffleMapStage 131 (mapToPair at GroupCombineFunctions.java:55) finished in 0.853 s
19/11/20 01:16:29 INFO DAGScheduler: looking for newly runnable stages
19/11/20 01:16:29 INFO DAGScheduler: running: Set()
19/11/20 01:16:29 INFO DAGScheduler: waiting: Set(ShuffleMapStage 132, ResultStage 133)
19/11/20 01:16:29 INFO DAGScheduler: failed: Set()
19/11/20 01:16:29 INFO DAGScheduler: Submitting ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115), which has no missing parents
19/11/20 01:16:29 INFO MemoryStore: Block broadcast_129 stored as values in memory (estimated size 57.2 KB, free 13.5 GB)
19/11/20 01:16:29 INFO MemoryStore: Block broadcast_129_piece0 stored as bytes in memory (estimated size 22.9 KB, free 13.5 GB)
19/11/20 01:16:29 INFO BlockManagerInfo: Added broadcast_129_piece0 in memory on localhost:36007 (size: 22.9 KB, free: 13.5 GB)
19/11/20 01:16:29 INFO SparkContext: Created broadcast 129 from broadcast at DAGScheduler.scala:1161
19/11/20 01:16:29 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115) (first 15 tasks are for partitions Vector(0, 1))
19/11/20 01:16:29 INFO TaskSchedulerImpl: Adding task set 132.0 with 2 tasks
19/11/20 01:16:29 INFO TaskSetManager: Starting task 1.0 in stage 132.0 (TID 159, localhost, executor driver, partition 1, NODE_LOCAL, 7760 bytes)
19/11/20 01:16:29 INFO Executor: Running task 1.0 in stage 132.0 (TID 159)
19/11/20 01:16:29 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/20 01:16:29 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/20 01:16:29 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestBL1WFM/job_5344d60b-bad5-4d7c-b90f-8dddcdc4bb86/MANIFEST
19/11/20 01:16:29 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestBL1WFM/job_5344d60b-bad5-4d7c-b90f-8dddcdc4bb86/MANIFEST -> 0 artifacts
19/11/20 01:16:30 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/20 01:16:30 INFO main: Logging handler created.
19/11/20 01:16:30 INFO start: Status HTTP server running at localhost:44031
19/11/20 01:16:30 INFO main: semi_persistent_directory: /tmp
19/11/20 01:16:30 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/20 01:16:30 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574212586.72_785b8dfd-5286-41fd-89ee-998f1f3b6cc6', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/20 01:16:30 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574212586.72', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42579'}
19/11/20 01:16:30 INFO __init__: Creating state cache with size 0
19/11/20 01:16:30 INFO __init__: Creating insecure control channel for localhost:41365.
19/11/20 01:16:30 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/20 01:16:30 INFO __init__: Control channel established.
19/11/20 01:16:30 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/20 01:16:30 INFO create_state_handler: Creating insecure state channel for localhost:35319.
19/11/20 01:16:30 INFO create_state_handler: State channel established.
19/11/20 01:16:30 INFO create_data_channel: Creating client data channel for localhost:44723
19/11/20 01:16:30 INFO GrpcDataService: Beam Fn Data client connected.
19/11/20 01:16:30 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/20 01:16:30 INFO run: No more requests from control plane
19/11/20 01:16:30 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/20 01:16:30 INFO close: Closing all cached grpc data channels.
19/11/20 01:16:30 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 01:16:30 INFO close: Closing all cached gRPC state handlers.
19/11/20 01:16:30 INFO run: Done consuming work.
19/11/20 01:16:30 INFO main: Python sdk harness exiting.
19/11/20 01:16:30 INFO GrpcLoggingService: Logging client hanged up.
19/11/20 01:16:30 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 01:16:30 INFO Executor: Finished task 1.0 in stage 132.0 (TID 159). 15229 bytes result sent to driver
19/11/20 01:16:30 INFO TaskSetManager: Starting task 0.0 in stage 132.0 (TID 160, localhost, executor driver, partition 0, PROCESS_LOCAL, 7977 bytes)
19/11/20 01:16:30 INFO Executor: Running task 0.0 in stage 132.0 (TID 160)
19/11/20 01:16:30 INFO TaskSetManager: Finished task 1.0 in stage 132.0 (TID 159) in 871 ms on localhost (executor driver) (1/2)
19/11/20 01:16:30 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestBL1WFM/job_5344d60b-bad5-4d7c-b90f-8dddcdc4bb86/MANIFEST
19/11/20 01:16:30 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestBL1WFM/job_5344d60b-bad5-4d7c-b90f-8dddcdc4bb86/MANIFEST -> 0 artifacts
19/11/20 01:16:30 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/20 01:16:30 INFO main: Logging handler created.
19/11/20 01:16:30 INFO start: Status HTTP server running at localhost:43587
19/11/20 01:16:30 INFO main: semi_persistent_directory: /tmp
19/11/20 01:16:30 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/20 01:16:30 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574212586.72_785b8dfd-5286-41fd-89ee-998f1f3b6cc6', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/20 01:16:30 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574212586.72', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42579'}
19/11/20 01:16:30 INFO __init__: Creating state cache with size 0
19/11/20 01:16:30 INFO __init__: Creating insecure control channel for localhost:39631.
19/11/20 01:16:30 INFO __init__: Control channel established.
19/11/20 01:16:30 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/20 01:16:30 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/20 01:16:30 INFO create_state_handler: Creating insecure state channel for localhost:40443.
19/11/20 01:16:30 INFO create_state_handler: State channel established.
19/11/20 01:16:30 INFO create_data_channel: Creating client data channel for localhost:46317
19/11/20 01:16:30 INFO GrpcDataService: Beam Fn Data client connected.
19/11/20 01:16:30 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/20 01:16:30 INFO run: No more requests from control plane
19/11/20 01:16:30 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/20 01:16:30 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 01:16:30 INFO close: Closing all cached grpc data channels.
19/11/20 01:16:30 INFO close: Closing all cached gRPC state handlers.
19/11/20 01:16:30 INFO run: Done consuming work.
19/11/20 01:16:30 INFO main: Python sdk harness exiting.
19/11/20 01:16:30 INFO GrpcLoggingService: Logging client hanged up.
19/11/20 01:16:31 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 01:16:31 INFO Executor: Finished task 0.0 in stage 132.0 (TID 160). 13710 bytes result sent to driver
19/11/20 01:16:31 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 160) in 811 ms on localhost (executor driver) (2/2)
19/11/20 01:16:31 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/20 01:16:31 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 1.688 s
19/11/20 01:16:31 INFO DAGScheduler: looking for newly runnable stages
19/11/20 01:16:31 INFO DAGScheduler: running: Set()
19/11/20 01:16:31 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/20 01:16:31 INFO DAGScheduler: failed: Set()
19/11/20 01:16:31 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/20 01:16:31 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.5 GB)
19/11/20 01:16:31 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.3 KB, free 13.5 GB)
19/11/20 01:16:31 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:36007 (size: 12.3 KB, free: 13.5 GB)
19/11/20 01:16:31 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/20 01:16:31 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/20 01:16:31 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/20 01:16:31 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/20 01:16:31 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/20 01:16:31 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/20 01:16:31 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 1 ms
19/11/20 01:16:31 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestBL1WFM/job_5344d60b-bad5-4d7c-b90f-8dddcdc4bb86/MANIFEST
19/11/20 01:16:31 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestBL1WFM/job_5344d60b-bad5-4d7c-b90f-8dddcdc4bb86/MANIFEST -> 0 artifacts
19/11/20 01:16:31 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/20 01:16:31 INFO main: Logging handler created.
19/11/20 01:16:31 INFO start: Status HTTP server running at localhost:34759
19/11/20 01:16:31 INFO main: semi_persistent_directory: /tmp
19/11/20 01:16:31 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/20 01:16:31 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574212586.72_785b8dfd-5286-41fd-89ee-998f1f3b6cc6', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/20 01:16:31 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574212586.72', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42579'}
19/11/20 01:16:31 INFO __init__: Creating state cache with size 0
19/11/20 01:16:31 INFO __init__: Creating insecure control channel for localhost:42405.
19/11/20 01:16:31 INFO __init__: Control channel established.
19/11/20 01:16:31 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/20 01:16:31 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/20 01:16:31 INFO create_state_handler: Creating insecure state channel for localhost:39065.
19/11/20 01:16:31 INFO create_state_handler: State channel established.
19/11/20 01:16:31 INFO create_data_channel: Creating client data channel for localhost:33611
19/11/20 01:16:31 INFO GrpcDataService: Beam Fn Data client connected.
19/11/20 01:16:31 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/20 01:16:31 INFO run: No more requests from control plane
19/11/20 01:16:31 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/20 01:16:31 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 01:16:31 INFO close: Closing all cached grpc data channels.
19/11/20 01:16:31 INFO close: Closing all cached gRPC state handlers.
19/11/20 01:16:31 INFO run: Done consuming work.
19/11/20 01:16:31 INFO main: Python sdk harness exiting.
19/11/20 01:16:31 INFO GrpcLoggingService: Logging client hanged up.
19/11/20 01:16:31 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/20 01:16:31 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 12013 bytes result sent to driver
19/11/20 01:16:31 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 832 ms on localhost (executor driver) (1/1)
19/11/20 01:16:31 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/20 01:16:31 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 0.838 s
19/11/20 01:16:31 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 4.173053 s
19/11/20 01:16:31 INFO SparkPipelineRunner: Job test_windowing_1574212586.72_785b8dfd-5286-41fd-89ee-998f1f3b6cc6 finished.
19/11/20 01:16:31 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/20 01:16:31 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktestBL1WFM/job_5344d60b-bad5-4d7c-b90f-8dddcdc4bb86/MANIFEST has 0 artifact locations
19/11/20 01:16:31 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestBL1WFM/job_5344d60b-bad5-4d7c-b90f-8dddcdc4bb86/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574212577.7_e71335a6-05d5-4de6-ac23-2693426f58bc failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140180627441408)>

# Thread: <Thread(Thread-119, started daemon 140180015019776)>

# Thread: <_MainThread(MainThread, started 140181142832896)>

----------------------------------------------------------------------
Ran 38 tests in 276.835s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 10s
59 actionable tasks: 46 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/ihedqla7vg4ns

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1576

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1576/display/redirect?page=changes>

Changes:

[robertwb] [BEAM-8335] Adds the StreamingCache (#10119)

[github] [BEAM-8151] Further cleanup of SDK Workers. (#10134)


------------------------------------------
[...truncated 1.66 MB...]
19/11/19 23:44:25 INFO DAGScheduler: waiting: Set(ShuffleMapStage 132, ResultStage 133)
19/11/19 23:44:25 INFO DAGScheduler: failed: Set()
19/11/19 23:44:25 INFO DAGScheduler: Submitting ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115), which has no missing parents
19/11/19 23:44:25 INFO MemoryStore: Block broadcast_129 stored as values in memory (estimated size 57.2 KB, free 13.5 GB)
19/11/19 23:44:25 INFO MemoryStore: Block broadcast_129_piece0 stored as bytes in memory (estimated size 22.9 KB, free 13.5 GB)
19/11/19 23:44:25 INFO BlockManagerInfo: Added broadcast_129_piece0 in memory on localhost:39291 (size: 22.9 KB, free: 13.5 GB)
19/11/19 23:44:25 INFO SparkContext: Created broadcast 129 from broadcast at DAGScheduler.scala:1161
19/11/19 23:44:25 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115) (first 15 tasks are for partitions Vector(0, 1))
19/11/19 23:44:25 INFO TaskSchedulerImpl: Adding task set 132.0 with 2 tasks
19/11/19 23:44:25 INFO TaskSetManager: Starting task 1.0 in stage 132.0 (TID 159, localhost, executor driver, partition 1, NODE_LOCAL, 7760 bytes)
19/11/19 23:44:25 INFO Executor: Running task 1.0 in stage 132.0 (TID 159)
19/11/19 23:44:25 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/19 23:44:25 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/19 23:44:25 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestqrgpQH/job_a6815061-3011-47c5-9d8a-905ec1c375e5/MANIFEST
19/11/19 23:44:25 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestqrgpQH/job_a6815061-3011-47c5-9d8a-905ec1c375e5/MANIFEST -> 0 artifacts
19/11/19 23:44:25 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/19 23:44:26 INFO main: Logging handler created.
19/11/19 23:44:26 INFO start: Status HTTP server running at localhost:40423
19/11/19 23:44:26 INFO main: semi_persistent_directory: /tmp
19/11/19 23:44:26 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/19 23:44:26 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574207062.55_8116a49d-e3be-4704-8372-67df56d78a4d', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/19 23:44:26 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574207062.55', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:33965'}
19/11/19 23:44:26 INFO __init__: Creating state cache with size 0
19/11/19 23:44:26 INFO __init__: Creating insecure control channel for localhost:41453.
19/11/19 23:44:26 INFO __init__: Control channel established.
19/11/19 23:44:26 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/19 23:44:26 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/19 23:44:26 INFO create_state_handler: Creating insecure state channel for localhost:35027.
19/11/19 23:44:26 INFO create_state_handler: State channel established.
19/11/19 23:44:26 INFO create_data_channel: Creating client data channel for localhost:35155
19/11/19 23:44:26 INFO GrpcDataService: Beam Fn Data client connected.
19/11/19 23:44:26 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/19 23:44:26 INFO run: No more requests from control plane
19/11/19 23:44:26 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/19 23:44:26 INFO close: Closing all cached grpc data channels.
19/11/19 23:44:26 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 23:44:26 INFO close: Closing all cached gRPC state handlers.
19/11/19 23:44:26 INFO run: Done consuming work.
19/11/19 23:44:26 INFO main: Python sdk harness exiting.
19/11/19 23:44:26 INFO GrpcLoggingService: Logging client hanged up.
19/11/19 23:44:26 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 23:44:26 INFO Executor: Finished task 1.0 in stage 132.0 (TID 159). 15229 bytes result sent to driver
19/11/19 23:44:26 INFO TaskSetManager: Starting task 0.0 in stage 132.0 (TID 160, localhost, executor driver, partition 0, PROCESS_LOCAL, 7977 bytes)
19/11/19 23:44:26 INFO Executor: Running task 0.0 in stage 132.0 (TID 160)
19/11/19 23:44:26 INFO TaskSetManager: Finished task 1.0 in stage 132.0 (TID 159) in 903 ms on localhost (executor driver) (1/2)
19/11/19 23:44:26 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestqrgpQH/job_a6815061-3011-47c5-9d8a-905ec1c375e5/MANIFEST
19/11/19 23:44:26 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestqrgpQH/job_a6815061-3011-47c5-9d8a-905ec1c375e5/MANIFEST -> 0 artifacts
19/11/19 23:44:26 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/19 23:44:26 INFO main: Logging handler created.
19/11/19 23:44:26 INFO start: Status HTTP server running at localhost:35613
19/11/19 23:44:26 INFO main: semi_persistent_directory: /tmp
19/11/19 23:44:26 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/19 23:44:26 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574207062.55_8116a49d-e3be-4704-8372-67df56d78a4d', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/19 23:44:26 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574207062.55', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:33965'}
19/11/19 23:44:26 INFO __init__: Creating state cache with size 0
19/11/19 23:44:26 INFO __init__: Creating insecure control channel for localhost:35711.
19/11/19 23:44:26 INFO __init__: Control channel established.
19/11/19 23:44:26 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/19 23:44:26 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/19 23:44:26 INFO create_state_handler: Creating insecure state channel for localhost:38545.
19/11/19 23:44:26 INFO create_state_handler: State channel established.
19/11/19 23:44:26 INFO create_data_channel: Creating client data channel for localhost:41019
19/11/19 23:44:26 INFO GrpcDataService: Beam Fn Data client connected.
19/11/19 23:44:26 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/19 23:44:26 INFO run: No more requests from control plane
19/11/19 23:44:26 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/19 23:44:26 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 23:44:26 INFO close: Closing all cached grpc data channels.
19/11/19 23:44:26 INFO close: Closing all cached gRPC state handlers.
19/11/19 23:44:26 INFO run: Done consuming work.
19/11/19 23:44:26 INFO main: Python sdk harness exiting.
19/11/19 23:44:26 INFO GrpcLoggingService: Logging client hanged up.
19/11/19 23:44:27 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 23:44:27 INFO Executor: Finished task 0.0 in stage 132.0 (TID 160). 13710 bytes result sent to driver
19/11/19 23:44:27 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 160) in 824 ms on localhost (executor driver) (2/2)
19/11/19 23:44:27 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/19 23:44:27 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 1.732 s
19/11/19 23:44:27 INFO DAGScheduler: looking for newly runnable stages
19/11/19 23:44:27 INFO DAGScheduler: running: Set()
19/11/19 23:44:27 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/19 23:44:27 INFO DAGScheduler: failed: Set()
19/11/19 23:44:27 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/19 23:44:27 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.5 GB)
19/11/19 23:44:27 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.3 KB, free 13.5 GB)
19/11/19 23:44:27 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:39291 (size: 12.3 KB, free: 13.5 GB)
19/11/19 23:44:27 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/19 23:44:27 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/19 23:44:27 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/19 23:44:27 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/19 23:44:27 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/19 23:44:27 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/19 23:44:27 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/19 23:44:27 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestqrgpQH/job_a6815061-3011-47c5-9d8a-905ec1c375e5/MANIFEST
19/11/19 23:44:27 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestqrgpQH/job_a6815061-3011-47c5-9d8a-905ec1c375e5/MANIFEST -> 0 artifacts
19/11/19 23:44:27 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/19 23:44:27 INFO main: Logging handler created.
19/11/19 23:44:27 INFO start: Status HTTP server running at localhost:36183
19/11/19 23:44:27 INFO main: semi_persistent_directory: /tmp
19/11/19 23:44:27 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/19 23:44:27 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574207062.55_8116a49d-e3be-4704-8372-67df56d78a4d', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/19 23:44:27 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574207062.55', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:33965'}
19/11/19 23:44:27 INFO __init__: Creating state cache with size 0
19/11/19 23:44:27 INFO __init__: Creating insecure control channel for localhost:38471.
19/11/19 23:44:27 INFO __init__: Control channel established.
19/11/19 23:44:27 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/19 23:44:27 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/19 23:44:27 INFO create_state_handler: Creating insecure state channel for localhost:44987.
19/11/19 23:44:27 INFO create_state_handler: State channel established.
19/11/19 23:44:27 INFO create_data_channel: Creating client data channel for localhost:39497
19/11/19 23:44:27 INFO GrpcDataService: Beam Fn Data client connected.
19/11/19 23:44:27 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/19 23:44:27 INFO run: No more requests from control plane
19/11/19 23:44:27 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/19 23:44:27 INFO close: Closing all cached grpc data channels.
19/11/19 23:44:27 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 23:44:27 INFO close: Closing all cached gRPC state handlers.
19/11/19 23:44:27 INFO run: Done consuming work.
19/11/19 23:44:27 INFO main: Python sdk harness exiting.
19/11/19 23:44:27 INFO GrpcLoggingService: Logging client hanged up.
19/11/19 23:44:27 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 23:44:27 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 11970 bytes result sent to driver
19/11/19 23:44:27 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 814 ms on localhost (executor driver) (1/1)
19/11/19 23:44:27 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/19 23:44:27 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 0.820 s
19/11/19 23:44:27 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 4.245154 s
19/11/19 23:44:27 INFO SparkPipelineRunner: Job test_windowing_1574207062.55_8116a49d-e3be-4704-8372-67df56d78a4d finished.
19/11/19 23:44:27 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/19 23:44:27 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktestqrgpQH/job_a6815061-3011-47c5-9d8a-905ec1c375e5/MANIFEST has 0 artifact locations
19/11/19 23:44:27 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestqrgpQH/job_a6815061-3011-47c5-9d8a-905ec1c375e5/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

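A note on this failure mode: the last traceback frame is a signal handler at portable_runner_test.py line 75, i.e. the test suite arms a 60-second watchdog around each pipeline run and raises BaseException when it fires. The "==================== Timed out ... ====================" banners and the "# Thread: <...>" lines below are that same watchdog dumping every live thread to stderr, which is why they interleave with the unittest report. A minimal sketch of such a watchdog, assuming SIGALRM and sys._current_frames (illustrative helper names, not Beam's exact code):

    import signal
    import sys
    import threading
    import traceback

    def install_watchdog(timeout=60):
        def handler(signum, frame):
            msg = 'Timed out after %d seconds.' % timeout
            print('==================== %s ====================' % msg)
            # Dump the stack of every live thread; this is what produces the
            # "# Thread: <Thread(...)>" lines seen in the log.
            frames = sys._current_frames()
            for t in threading.enumerate():
                print('# Thread: %s' % t)
                if t.ident in frames:
                    traceback.print_stack(frames[t.ident])
            raise BaseException(msg)  # BaseException so test code cannot swallow it
        signal.signal(signal.SIGALRM, handler)
        signal.alarm(timeout)
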
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140413205436160)>
# Thread: <Thread(Thread-119, started daemon 140413213828864)>
# Thread: <_MainThread(MainThread, started 140414000391936)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140413194946304)>
# Thread: <Thread(Thread-125, started daemon 140413186553600)>
# Thread: <_MainThread(MainThread, started 140414000391936)>
# Thread: <Thread(wait_until_finish_read, started daemon 140413205436160)>
# Thread: <Thread(Thread-119, started daemon 140413213828864)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

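For reference, test_pardo_timers drives an event-time user timer through the portable Fn API; here the job-state stream from the job server never terminates, so wait_until_finish blocks until the watchdog above fires. The DoFn under test is shaped roughly like the sketch below, assuming the apache_beam.transforms.userstate API (names and firing logic simplified, not the test's exact code):

    import apache_beam as beam
    from apache_beam.transforms import userstate
    from apache_beam.transforms.timeutil import TimeDomain

    class TimerDoFn(beam.DoFn):
        # An event-time timer; the runner must route its firing back to the
        # SDK harness over the Fn API, which is the machinery under test.
        TIMER = userstate.TimerSpec('timer', TimeDomain.WATERMARK)

        def process(self, element, timer=beam.DoFn.TimerParam(TIMER)):
            unused_key, ts = element   # timers require a keyed PCollection
            timer.set(ts + 10)         # fire once the watermark passes ts + 10

        @userstate.on_timer(TIMER)
        def on_fire(self):
            yield 'fired'
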
======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574207053.64_ccd46617-61a0-4434-8b02-7112dea0a00c failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

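Unlike the two timeouts, this failure is an immediate error from the runner: the Spark ActiveBundle has no checkpoint handler registered, so a splittable DoFn that defers a residual (checkpoints) apparently cannot run on this runner yet. The kind of splittable DoFn (SDF) such a test drives is sketched below, assuming apache_beam's RestrictionProvider and OffsetRange APIs; the real test also tracks watermarks, which this sketch omits:

    import apache_beam as beam
    from apache_beam.io.restriction_trackers import OffsetRange, OffsetRestrictionTracker
    from apache_beam.transforms.core import RestrictionProvider

    class CharsProvider(RestrictionProvider):
        def initial_restriction(self, element):
            return OffsetRange(0, len(element))

        def create_tracker(self, restriction):
            return OffsetRestrictionTracker(restriction)

        def restriction_size(self, element, restriction):
            return restriction.size()

    class EmitChars(beam.DoFn):
        # Emits one character per claimed offset. If the runner splits the
        # restriction, the unclaimed remainder comes back as a checkpoint,
        # which is what the error above says Spark cannot yet handle.
        def process(self, element, tracker=beam.DoFn.RestrictionParam(CharsProvider())):
            cur = tracker.current_restriction().start
            while tracker.try_claim(cur):
                yield element[cur]
                cur += 1
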
----------------------------------------------------------------------
Ran 38 tests in 295.196s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 24s
59 actionable tasks: 46 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/vomfalc7o3who

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1575

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1575/display/redirect?page=changes>

Changes:

[suztomo] empty commit

[suztomo] Revert empty line

[suztomo] Fixed typo

[suztomo] Correct value is 'Triage Needed'

[suztomo] Reverting unnecessary changes


------------------------------------------
[...truncated 1.68 MB...]
19/11/19 22:18:11 INFO DAGScheduler: waiting: Set(ShuffleMapStage 132, ResultStage 133)
19/11/19 22:18:11 INFO DAGScheduler: failed: Set()
19/11/19 22:18:11 INFO DAGScheduler: Submitting ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115), which has no missing parents
19/11/19 22:18:11 INFO MemoryStore: Block broadcast_129 stored as values in memory (estimated size 57.2 KB, free 13.4 GB)
19/11/19 22:18:11 INFO MemoryStore: Block broadcast_129_piece0 stored as bytes in memory (estimated size 22.0 KB, free 13.4 GB)
19/11/19 22:18:11 INFO BlockManagerInfo: Added broadcast_129_piece0 in memory on localhost:40625 (size: 22.0 KB, free: 13.4 GB)
19/11/19 22:18:11 INFO SparkContext: Created broadcast 129 from broadcast at DAGScheduler.scala:1161
19/11/19 22:18:11 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115) (first 15 tasks are for partitions Vector(0, 1))
19/11/19 22:18:11 INFO TaskSchedulerImpl: Adding task set 132.0 with 2 tasks
19/11/19 22:18:11 INFO TaskSetManager: Starting task 0.0 in stage 132.0 (TID 159, localhost, executor driver, partition 0, NODE_LOCAL, 7760 bytes)
19/11/19 22:18:11 INFO Executor: Running task 0.0 in stage 132.0 (TID 159)
19/11/19 22:18:11 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/19 22:18:11 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/19 22:18:11 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest0Sx8TT/job_eb4e4ab1-fda4-4d99-80f4-a0d9a1643404/MANIFEST
19/11/19 22:18:11 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest0Sx8TT/job_eb4e4ab1-fda4-4d99-80f4-a0d9a1643404/MANIFEST -> 0 artifacts
19/11/19 22:18:12 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/19 22:18:12 INFO main: Logging handler created.
19/11/19 22:18:12 INFO start: Status HTTP server running at localhost:42557
19/11/19 22:18:12 INFO main: semi_persistent_directory: /tmp
19/11/19 22:18:12 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/19 22:18:12 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574201888.39_69ae1f99-1b93-4fb7-b209-7fc4fd17f415', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/19 22:18:12 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574201888.39', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35121'}
19/11/19 22:18:12 INFO __init__: Creating state cache with size 0
19/11/19 22:18:12 INFO __init__: Creating insecure control channel for localhost:41547.
19/11/19 22:18:12 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/19 22:18:12 INFO __init__: Control channel established.
19/11/19 22:18:12 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/19 22:18:12 INFO create_state_handler: Creating insecure state channel for localhost:32813.
19/11/19 22:18:12 INFO create_state_handler: State channel established.
19/11/19 22:18:12 INFO create_data_channel: Creating client data channel for localhost:34349
19/11/19 22:18:12 INFO GrpcDataService: Beam Fn Data client connected.
19/11/19 22:18:12 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/19 22:18:12 INFO run: No more requests from control plane
19/11/19 22:18:12 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/19 22:18:12 INFO close: Closing all cached grpc data channels.
19/11/19 22:18:12 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 22:18:12 INFO close: Closing all cached gRPC state handlers.
19/11/19 22:18:12 INFO run: Done consuming work.
19/11/19 22:18:12 INFO main: Python sdk harness exiting.
19/11/19 22:18:12 INFO GrpcLoggingService: Logging client hanged up.
19/11/19 22:18:12 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 22:18:12 INFO Executor: Finished task 0.0 in stage 132.0 (TID 159). 15229 bytes result sent to driver
19/11/19 22:18:12 INFO TaskSetManager: Starting task 1.0 in stage 132.0 (TID 160, localhost, executor driver, partition 1, PROCESS_LOCAL, 7977 bytes)
19/11/19 22:18:12 INFO Executor: Running task 1.0 in stage 132.0 (TID 160)
19/11/19 22:18:12 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 159) in 1077 ms on localhost (executor driver) (1/2)
19/11/19 22:18:12 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest0Sx8TT/job_eb4e4ab1-fda4-4d99-80f4-a0d9a1643404/MANIFEST
19/11/19 22:18:12 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest0Sx8TT/job_eb4e4ab1-fda4-4d99-80f4-a0d9a1643404/MANIFEST -> 0 artifacts
19/11/19 22:18:13 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/19 22:18:13 INFO main: Logging handler created.
19/11/19 22:18:13 INFO start: Status HTTP server running at localhost:44601
19/11/19 22:18:13 INFO main: semi_persistent_directory: /tmp
19/11/19 22:18:13 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/19 22:18:13 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574201888.39_69ae1f99-1b93-4fb7-b209-7fc4fd17f415', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/19 22:18:13 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574201888.39', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35121'}
19/11/19 22:18:13 INFO __init__: Creating state cache with size 0
19/11/19 22:18:13 INFO __init__: Creating insecure control channel for localhost:35921.
19/11/19 22:18:13 INFO __init__: Control channel established.
19/11/19 22:18:13 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/19 22:18:13 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/19 22:18:13 INFO create_state_handler: Creating insecure state channel for localhost:45695.
19/11/19 22:18:13 INFO create_state_handler: State channel established.
19/11/19 22:18:13 INFO create_data_channel: Creating client data channel for localhost:33303
19/11/19 22:18:13 INFO GrpcDataService: Beam Fn Data client connected.
19/11/19 22:18:13 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/19 22:18:13 INFO run: No more requests from control plane
19/11/19 22:18:13 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/19 22:18:13 INFO close: Closing all cached grpc data channels.
19/11/19 22:18:13 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 22:18:13 INFO close: Closing all cached gRPC state handlers.
19/11/19 22:18:13 INFO run: Done consuming work.
19/11/19 22:18:13 INFO main: Python sdk harness exiting.
19/11/19 22:18:13 INFO GrpcLoggingService: Logging client hanged up.
19/11/19 22:18:13 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 22:18:13 INFO Executor: Finished task 1.0 in stage 132.0 (TID 160). 13710 bytes result sent to driver
19/11/19 22:18:13 INFO TaskSetManager: Finished task 1.0 in stage 132.0 (TID 160) in 995 ms on localhost (executor driver) (2/2)
19/11/19 22:18:13 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/19 22:18:13 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 2.080 s
19/11/19 22:18:13 INFO DAGScheduler: looking for newly runnable stages
19/11/19 22:18:13 INFO DAGScheduler: running: Set()
19/11/19 22:18:13 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/19 22:18:13 INFO DAGScheduler: failed: Set()
19/11/19 22:18:13 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/19 22:18:13 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.4 GB)
19/11/19 22:18:13 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.4 KB, free 13.4 GB)
19/11/19 22:18:13 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:40625 (size: 12.4 KB, free: 13.4 GB)
19/11/19 22:18:13 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/19 22:18:13 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/19 22:18:13 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/19 22:18:13 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/19 22:18:13 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/19 22:18:13 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/19 22:18:13 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 1 ms
19/11/19 22:18:13 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest0Sx8TT/job_eb4e4ab1-fda4-4d99-80f4-a0d9a1643404/MANIFEST
19/11/19 22:18:13 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest0Sx8TT/job_eb4e4ab1-fda4-4d99-80f4-a0d9a1643404/MANIFEST -> 0 artifacts
19/11/19 22:18:14 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/19 22:18:14 INFO main: Logging handler created.
19/11/19 22:18:14 INFO start: Status HTTP server running at localhost:41947
19/11/19 22:18:14 INFO main: semi_persistent_directory: /tmp
19/11/19 22:18:14 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/19 22:18:14 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574201888.39_69ae1f99-1b93-4fb7-b209-7fc4fd17f415', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/19 22:18:14 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574201888.39', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35121'}
19/11/19 22:18:14 INFO __init__: Creating state cache with size 0
19/11/19 22:18:14 INFO __init__: Creating insecure control channel for localhost:45767.
19/11/19 22:18:14 INFO __init__: Control channel established.
19/11/19 22:18:14 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/19 22:18:14 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/19 22:18:14 INFO create_state_handler: Creating insecure state channel for localhost:45311.
19/11/19 22:18:14 INFO create_state_handler: State channel established.
19/11/19 22:18:14 INFO create_data_channel: Creating client data channel for localhost:40755
19/11/19 22:18:14 INFO GrpcDataService: Beam Fn Data client connected.
19/11/19 22:18:14 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/19 22:18:14 INFO run: No more requests from control plane
19/11/19 22:18:14 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/19 22:18:14 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 22:18:14 INFO close: Closing all cached grpc data channels.
19/11/19 22:18:14 INFO close: Closing all cached gRPC state handlers.
19/11/19 22:18:14 INFO run: Done consuming work.
19/11/19 22:18:14 INFO main: Python sdk harness exiting.
19/11/19 22:18:14 INFO GrpcLoggingService: Logging client hanged up.
19/11/19 22:18:14 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 22:18:14 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 11970 bytes result sent to driver
19/11/19 22:18:14 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 1016 ms on localhost (executor driver) (1/1)
19/11/19 22:18:14 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/19 22:18:14 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 1.023 s
19/11/19 22:18:14 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 5.161838 s
19/11/19 22:18:14 INFO SparkPipelineRunner: Job test_windowing_1574201888.39_69ae1f99-1b93-4fb7-b209-7fc4fd17f415 finished.
19/11/19 22:18:14 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/19 22:18:14 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktest0Sx8TT/job_eb4e4ab1-fda4-4d99-80f4-a0d9a1643404/MANIFEST has 0 artifact locations
19/11/19 22:18:14 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktest0Sx8TT/job_eb4e4ab1-fda4-4d99-80f4-a0d9a1643404/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

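The same three tests fail in this build as in the one above. For the state test, the custom key coder matters because every state request the SDK harness makes is addressed by key bytes the runner must treat as opaque; a rough sketch of a stateful DoFn of the kind being exercised, using apache_beam.transforms.userstate (illustrative, not the test's exact code):

    import apache_beam as beam
    from apache_beam.coders import VarIntCoder
    from apache_beam.transforms import userstate

    class CountPerKey(beam.DoFn):
        # Per-key bag state: each read()/add() below becomes a state request
        # over the Fn API, routed by the runner-encoded key bytes.
        SEEN = userstate.BagStateSpec('seen', VarIntCoder())

        def process(self, kv, seen=beam.DoFn.StateParam(SEEN)):
            key, _ = kv
            count = len(list(seen.read())) + 1
            seen.add(count)
            yield key, count
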
======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140488276932352)>
# Thread: <Thread(Thread-119, started daemon 140488285325056)>
# Thread: <_MainThread(MainThread, started 140489072948992)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140488258836224)>
# Thread: <Thread(Thread-125, started daemon 140488267491072)>
# Thread: <Thread(Thread-119, started daemon 140488285325056)>
# Thread: <_MainThread(MainThread, started 140489072948992)>
# Thread: <Thread(wait_until_finish_read, started daemon 140488276932352)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574201877.3_f36a713b-1f1a-45de-9243-6804968d606c failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 344.497s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 21s
59 actionable tasks: 46 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/ppgl3fy4jmsym

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1574

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1574/display/redirect?page=changes>

Changes:

[markliu] Add Java mobile game on DirectRunner to release script


------------------------------------------
[...truncated 1.67 MB...]
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2503
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2564
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2517
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2750
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2951
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2709
19/11/19 21:44:21 INFO ContextCleaner: Cleaned shuffle 70
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2549
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2674
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2874
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2948
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2646
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2862
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2641
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2818
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2809
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2643
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2877
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2528
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2553
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2794
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2676
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2819
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2931
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2616
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2886
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2537
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2584
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2649
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2519
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2797
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2495
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2602
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2532
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2619
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2585
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2827
19/11/19 21:44:21 INFO BlockManagerInfo: Removed broadcast_123_piece0 on localhost:46667 in memory (size: 9.1 KB, free: 13.4 GB)
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2724
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2546
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2591
19/11/19 21:44:21 INFO BlockManagerInfo: Removed broadcast_110_piece0 on localhost:46667 in memory (size: 20.0 KB, free: 13.4 GB)
19/11/19 21:44:21 INFO BlockManagerInfo: Removed broadcast_127_piece0 on localhost:46667 in memory (size: 9.0 KB, free: 13.4 GB)
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2838
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2702
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2628
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2721
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2670
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2876
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2719
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2605
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2592
19/11/19 21:44:21 INFO ContextCleaner: Cleaned accumulator 2516
19/11/19 21:44:22 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/19 21:44:22 INFO main: Logging handler created.
19/11/19 21:44:22 INFO start: Status HTTP server running at localhost:38733
19/11/19 21:44:22 INFO main: semi_persistent_directory: /tmp
19/11/19 21:44:22 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/19 21:44:22 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574199858.45_c4096c27-03b9-4992-8e7d-a6825cb62561', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/19 21:44:22 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574199858.45', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42293'}
19/11/19 21:44:22 INFO __init__: Creating state cache with size 0
19/11/19 21:44:22 INFO __init__: Creating insecure control channel for localhost:33233.
19/11/19 21:44:22 INFO __init__: Control channel established.
19/11/19 21:44:22 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/19 21:44:22 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/19 21:44:22 INFO create_state_handler: Creating insecure state channel for localhost:45173.
19/11/19 21:44:22 INFO create_state_handler: State channel established.
19/11/19 21:44:22 INFO create_data_channel: Creating client data channel for localhost:42093
19/11/19 21:44:22 INFO GrpcDataService: Beam Fn Data client connected.
19/11/19 21:44:22 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/19 21:44:22 INFO run: No more requests from control plane
19/11/19 21:44:22 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/19 21:44:22 INFO close: Closing all cached grpc data channels.
19/11/19 21:44:22 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 21:44:22 INFO close: Closing all cached gRPC state handlers.
19/11/19 21:44:22 INFO run: Done consuming work.
19/11/19 21:44:22 INFO main: Python sdk harness exiting.
19/11/19 21:44:22 INFO GrpcLoggingService: Logging client hanged up.
19/11/19 21:44:22 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 21:44:22 INFO Executor: Finished task 0.0 in stage 132.0 (TID 160). 13753 bytes result sent to driver
19/11/19 21:44:22 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 160) in 841 ms on localhost (executor driver) (2/2)
19/11/19 21:44:22 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/19 21:44:22 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 1.689 s
19/11/19 21:44:22 INFO DAGScheduler: looking for newly runnable stages
19/11/19 21:44:22 INFO DAGScheduler: running: Set()
19/11/19 21:44:22 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/19 21:44:22 INFO DAGScheduler: failed: Set()
19/11/19 21:44:22 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/19 21:44:22 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.4 GB)
19/11/19 21:44:22 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.3 KB, free 13.4 GB)
19/11/19 21:44:22 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:46667 (size: 12.3 KB, free: 13.4 GB)
19/11/19 21:44:22 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/19 21:44:22 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/19 21:44:22 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/19 21:44:22 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/19 21:44:22 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/19 21:44:22 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/19 21:44:22 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/19 21:44:22 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestLS7Nrc/job_2c02e26e-2572-4604-89f2-d028581e9cd3/MANIFEST
19/11/19 21:44:22 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestLS7Nrc/job_2c02e26e-2572-4604-89f2-d028581e9cd3/MANIFEST -> 0 artifacts
19/11/19 21:44:23 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/19 21:44:23 INFO main: Logging handler created.
19/11/19 21:44:23 INFO start: Status HTTP server running at localhost:46581
19/11/19 21:44:23 INFO main: semi_persistent_directory: /tmp
19/11/19 21:44:23 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/19 21:44:23 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574199858.45_c4096c27-03b9-4992-8e7d-a6825cb62561', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/19 21:44:23 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574199858.45', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42293'}
19/11/19 21:44:23 INFO __init__: Creating state cache with size 0
19/11/19 21:44:23 INFO __init__: Creating insecure control channel for localhost:43199.
19/11/19 21:44:23 INFO __init__: Control channel established.
19/11/19 21:44:23 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/19 21:44:23 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/19 21:44:23 INFO create_state_handler: Creating insecure state channel for localhost:46151.
19/11/19 21:44:23 INFO create_state_handler: State channel established.
19/11/19 21:44:23 INFO create_data_channel: Creating client data channel for localhost:42407
19/11/19 21:44:23 INFO GrpcDataService: Beam Fn Data client connected.
19/11/19 21:44:23 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/19 21:44:23 INFO run: No more requests from control plane
19/11/19 21:44:23 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/19 21:44:23 INFO close: Closing all cached grpc data channels.
19/11/19 21:44:23 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 21:44:23 INFO close: Closing all cached gRPC state handlers.
19/11/19 21:44:23 INFO run: Done consuming work.
19/11/19 21:44:23 INFO main: Python sdk harness exiting.
19/11/19 21:44:23 INFO GrpcLoggingService: Logging client hanged up.
19/11/19 21:44:23 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 21:44:23 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 11970 bytes result sent to driver
19/11/19 21:44:23 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 801 ms on localhost (executor driver) (1/1)
19/11/19 21:44:23 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/19 21:44:23 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 0.806 s
19/11/19 21:44:23 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 4.090454 s
19/11/19 21:44:23 INFO SparkPipelineRunner: Job test_windowing_1574199858.45_c4096c27-03b9-4992-8e7d-a6825cb62561 finished.
19/11/19 21:44:23 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/19 21:44:23 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktestLS7Nrc/job_2c02e26e-2572-4604-89f2-d028581e9cd3/MANIFEST has 0 artifact locations
19/11/19 21:44:23 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestLS7Nrc/job_2c02e26e-2572-4604-89f2-d028581e9cd3/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140177968228096)>
# Thread: <Thread(Thread-118, started daemon 140177959835392)>
# Thread: <_MainThread(MainThread, started 140178755852032)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140177330665216)>
# Thread: <Thread(Thread-124, started daemon 140177950394112)>
# Thread: <_MainThread(MainThread, started 140178755852032)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574199849.8_26c7b82f-58bc-4569-8ba8-19efd9bed87b failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 288.053s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 6s
59 actionable tasks: 46 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/qmtdo6oakmfv2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1573

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1573/display/redirect?page=changes>

Changes:

[robertwb] [BEAM-8575] Test flatten a single PC and test flatten a flattened PC

[lukecwik] Allow metrics update to be tolerant to uninitalized metric containers.

[lukecwik] [BEAM-3493] Prevent users from "implementing" PipelineOptions. (#10005)

[robertwb] [BEAM-8645] Create a py test case for Re-iteration on GBK result. 


------------------------------------------
[...truncated 1.67 MB...]
19/11/19 20:23:33 INFO DAGScheduler: waiting: Set(ShuffleMapStage 132, ResultStage 133)
19/11/19 20:23:33 INFO DAGScheduler: failed: Set()
19/11/19 20:23:33 INFO DAGScheduler: Submitting ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115), which has no missing parents
19/11/19 20:23:33 INFO MemoryStore: Block broadcast_129 stored as values in memory (estimated size 57.2 KB, free 13.5 GB)
19/11/19 20:23:33 INFO MemoryStore: Block broadcast_129_piece0 stored as bytes in memory (estimated size 22.1 KB, free 13.5 GB)
19/11/19 20:23:33 INFO BlockManagerInfo: Added broadcast_129_piece0 in memory on localhost:42447 (size: 22.1 KB, free: 13.5 GB)
19/11/19 20:23:33 INFO SparkContext: Created broadcast 129 from broadcast at DAGScheduler.scala:1161
19/11/19 20:23:33 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115) (first 15 tasks are for partitions Vector(0, 1))
19/11/19 20:23:33 INFO TaskSchedulerImpl: Adding task set 132.0 with 2 tasks
19/11/19 20:23:33 INFO TaskSetManager: Starting task 0.0 in stage 132.0 (TID 159, localhost, executor driver, partition 0, NODE_LOCAL, 7760 bytes)
19/11/19 20:23:33 INFO Executor: Running task 0.0 in stage 132.0 (TID 159)
19/11/19 20:23:33 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/19 20:23:33 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/19 20:23:33 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest1EnLEj/job_db32469c-f2a6-487a-b528-6f4c19e67f2a/MANIFEST
19/11/19 20:23:33 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest1EnLEj/job_db32469c-f2a6-487a-b528-6f4c19e67f2a/MANIFEST -> 0 artifacts
19/11/19 20:23:34 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/19 20:23:34 INFO main: Logging handler created.
19/11/19 20:23:34 INFO start: Status HTTP server running at localhost:43903
19/11/19 20:23:34 INFO main: semi_persistent_directory: /tmp
19/11/19 20:23:34 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/19 20:23:34 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574195011.21_99c32279-5c48-4b66-80b3-e5fe227bc632', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/19 20:23:34 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574195011.21', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:38529'}
19/11/19 20:23:34 INFO __init__: Creating state cache with size 0
19/11/19 20:23:34 INFO __init__: Creating insecure control channel for localhost:45361.
19/11/19 20:23:34 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/19 20:23:34 INFO __init__: Control channel established.
19/11/19 20:23:34 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/19 20:23:34 INFO create_state_handler: Creating insecure state channel for localhost:41255.
19/11/19 20:23:34 INFO create_state_handler: State channel established.
19/11/19 20:23:34 INFO create_data_channel: Creating client data channel for localhost:46363
19/11/19 20:23:34 INFO GrpcDataService: Beam Fn Data client connected.
19/11/19 20:23:34 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/19 20:23:34 INFO run: No more requests from control plane
19/11/19 20:23:34 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/19 20:23:34 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 20:23:34 INFO close: Closing all cached grpc data channels.
19/11/19 20:23:34 INFO close: Closing all cached gRPC state handlers.
19/11/19 20:23:34 INFO run: Done consuming work.
19/11/19 20:23:34 INFO main: Python sdk harness exiting.
19/11/19 20:23:34 INFO GrpcLoggingService: Logging client hanged up.
19/11/19 20:23:34 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 20:23:34 INFO Executor: Finished task 0.0 in stage 132.0 (TID 159). 15229 bytes result sent to driver
19/11/19 20:23:34 INFO TaskSetManager: Starting task 1.0 in stage 132.0 (TID 160, localhost, executor driver, partition 1, PROCESS_LOCAL, 7977 bytes)
19/11/19 20:23:34 INFO Executor: Running task 1.0 in stage 132.0 (TID 160)
19/11/19 20:23:34 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 159) in 963 ms on localhost (executor driver) (1/2)
19/11/19 20:23:34 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest1EnLEj/job_db32469c-f2a6-487a-b528-6f4c19e67f2a/MANIFEST
19/11/19 20:23:34 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest1EnLEj/job_db32469c-f2a6-487a-b528-6f4c19e67f2a/MANIFEST -> 0 artifacts
19/11/19 20:23:35 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/19 20:23:35 INFO main: Logging handler created.
19/11/19 20:23:35 INFO start: Status HTTP server running at localhost:34113
19/11/19 20:23:35 INFO main: semi_persistent_directory: /tmp
19/11/19 20:23:35 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/19 20:23:35 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574195011.21_99c32279-5c48-4b66-80b3-e5fe227bc632', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/19 20:23:35 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574195011.21', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:38529'}
19/11/19 20:23:35 INFO __init__: Creating state cache with size 0
19/11/19 20:23:35 INFO __init__: Creating insecure control channel for localhost:44627.
19/11/19 20:23:35 INFO __init__: Control channel established.
19/11/19 20:23:35 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/19 20:23:35 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/19 20:23:35 INFO create_state_handler: Creating insecure state channel for localhost:45871.
19/11/19 20:23:35 INFO create_state_handler: State channel established.
19/11/19 20:23:35 INFO create_data_channel: Creating client data channel for localhost:40655
19/11/19 20:23:35 INFO GrpcDataService: Beam Fn Data client connected.
19/11/19 20:23:35 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/19 20:23:35 INFO run: No more requests from control plane
19/11/19 20:23:35 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/19 20:23:35 INFO close: Closing all cached grpc data channels.
19/11/19 20:23:35 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 20:23:35 INFO close: Closing all cached gRPC state handlers.
19/11/19 20:23:35 INFO run: Done consuming work.
19/11/19 20:23:35 INFO main: Python sdk harness exiting.
19/11/19 20:23:35 INFO GrpcLoggingService: Logging client hanged up.
19/11/19 20:23:35 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 20:23:35 INFO Executor: Finished task 1.0 in stage 132.0 (TID 160). 13710 bytes result sent to driver
19/11/19 20:23:35 INFO TaskSetManager: Finished task 1.0 in stage 132.0 (TID 160) in 821 ms on localhost (executor driver) (2/2)
19/11/19 20:23:35 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/19 20:23:35 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 1.790 s
19/11/19 20:23:35 INFO DAGScheduler: looking for newly runnable stages
19/11/19 20:23:35 INFO DAGScheduler: running: Set()
19/11/19 20:23:35 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/19 20:23:35 INFO DAGScheduler: failed: Set()
19/11/19 20:23:35 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/19 20:23:35 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.5 GB)
19/11/19 20:23:35 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.4 KB, free 13.5 GB)
19/11/19 20:23:35 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:42447 (size: 12.4 KB, free: 13.5 GB)
19/11/19 20:23:35 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/19 20:23:35 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/19 20:23:35 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/19 20:23:35 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/19 20:23:35 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/19 20:23:35 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/19 20:23:35 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/19 20:23:35 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest1EnLEj/job_db32469c-f2a6-487a-b528-6f4c19e67f2a/MANIFEST
19/11/19 20:23:35 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest1EnLEj/job_db32469c-f2a6-487a-b528-6f4c19e67f2a/MANIFEST -> 0 artifacts
19/11/19 20:23:36 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/19 20:23:36 INFO main: Logging handler created.
19/11/19 20:23:36 INFO start: Status HTTP server running at localhost:39381
19/11/19 20:23:36 INFO main: semi_persistent_directory: /tmp
19/11/19 20:23:36 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/19 20:23:36 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574195011.21_99c32279-5c48-4b66-80b3-e5fe227bc632', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/19 20:23:36 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574195011.21', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:38529'}
19/11/19 20:23:36 INFO __init__: Creating state cache with size 0
19/11/19 20:23:36 INFO __init__: Creating insecure control channel for localhost:45841.
19/11/19 20:23:36 INFO __init__: Control channel established.
19/11/19 20:23:36 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/19 20:23:36 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/19 20:23:36 INFO create_state_handler: Creating insecure state channel for localhost:36885.
19/11/19 20:23:36 INFO create_state_handler: State channel established.
19/11/19 20:23:36 INFO create_data_channel: Creating client data channel for localhost:33961
19/11/19 20:23:36 INFO GrpcDataService: Beam Fn Data client connected.
19/11/19 20:23:36 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/19 20:23:36 INFO run: No more requests from control plane
19/11/19 20:23:36 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/19 20:23:36 INFO close: Closing all cached grpc data channels.
19/11/19 20:23:36 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 20:23:36 INFO close: Closing all cached gRPC state handlers.
19/11/19 20:23:36 INFO run: Done consuming work.
19/11/19 20:23:36 INFO main: Python sdk harness exiting.
19/11/19 20:23:36 INFO GrpcLoggingService: Logging client hanged up.
19/11/19 20:23:36 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 20:23:36 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 11970 bytes result sent to driver
19/11/19 20:23:36 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 835 ms on localhost (executor driver) (1/1)
19/11/19 20:23:36 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/19 20:23:36 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 0.842 s
19/11/19 20:23:36 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 4.362683 s
19/11/19 20:23:36 INFO SparkPipelineRunner: Job test_windowing_1574195011.21_99c32279-5c48-4b66-80b3-e5fe227bc632 finished.
19/11/19 20:23:36 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/19 20:23:36 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktest1EnLEj/job_db32469c-f2a6-487a-b528-6f4c19e67f2a/MANIFEST has 0 artifact locations
19/11/19 20:23:36 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktest1EnLEj/job_db32469c-f2a6-487a-b528-6f4c19e67f2a/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140461764572928)>
# Thread: <Thread(Thread-119, started daemon 140461493401344)>
# Thread: <_MainThread(MainThread, started 140462279964416)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140460991440640)>
# Thread: <Thread(Thread-125, started daemon 140460999833344)>
# Thread: <Thread(Thread-119, started daemon 140461493401344)>
# Thread: <Thread(wait_until_finish_read, started daemon 140461764572928)>
# Thread: <_MainThread(MainThread, started 140462279964416)>
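
The "Timed out" banners and "# Thread:" lines above come from the test harness's watchdog (the handler frame at portable_runner_test.py line 75 in the tracebacks): when a test exceeds 60 seconds it prints a banner, dumps every live thread so the hung wait (here wait_until_finish_read) can be identified, and raises BaseException into the blocked main thread. Because the watchdog writes to the same stream as the traceback printer, its output interleaves with the tracebacks. A rough sketch of such a watchdog, using illustrative names only (this is not the harness's actual code):

    # Hypothetical SIGALRM watchdog sketch; on timeout it prints a banner
    # plus one "# Thread:" line per live thread, then raises, matching the
    # shape of the output above. All names here are assumptions.
    import signal
    import threading

    TIMEOUT_SECS = 60

    def handler(signum, frame):
        msg = 'Timed out after %d seconds.' % TIMEOUT_SECS
        print('==================== %s ====================' % msg)
        for t in threading.enumerate():
            print('# Thread: %s' % t)  # e.g. wait_until_finish_read
        raise BaseException(msg)  # surfaces in the blocked main thread

    signal.signal(signal.SIGALRM, handler)
    signal.alarm(TIMEOUT_SECS)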

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574195001.96_5555d6a0-777c-4fb8-9278-f3a46d04839c failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
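
Unlike the two timeouts above, test_sdf_with_watermark_tracking fails fast: the Spark portable runner has no bundle checkpoint handler registered, and splittable DoFn watermark tracking needs one to hand residual work back to the runner. All three failures surface at the same spot because leaving a with-pipeline block runs the pipeline and blocks in wait_until_finish. A minimal sketch of that test pattern, with an assumed placeholder pipeline body:

    # Minimal sketch of the pattern in the tracebacks above: exiting the
    # "with" block calls run().wait_until_finish(), so hangs and failures
    # surface there. The pipeline body is an assumed placeholder.
    import apache_beam as beam
    from apache_beam.testing.util import assert_that, equal_to

    with beam.Pipeline() as p:  # __exit__ -> self.run().wait_until_finish()
        actual = p | beam.Create(['a', 'b', 'c'])
        assert_that(actual, equal_to(['a', 'b', 'c']))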

----------------------------------------------------------------------
Ran 38 tests in 304.140s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
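
The failing task can presumably be reproduced locally from a Beam checkout with ./gradlew :sdks:python:test-suites:portable:py2:sparkValidatesRunner --stacktrace (task name from the failure above; the flag is Gradle's own suggestion).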

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 35s
59 actionable tasks: 47 executed, 12 from cache

Publishing build scan...
https://gradle.com/s/ngtyqa7c3yaco

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1572

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1572/display/redirect?page=changes>

Changes:

[robertwb] [BEAM-7473] Pack RangeTracker into restriction (#10118)


------------------------------------------
[...truncated 1.67 MB...]
19/11/19 18:39:35 INFO DAGScheduler: waiting: Set(ShuffleMapStage 132, ResultStage 133)
19/11/19 18:39:35 INFO DAGScheduler: failed: Set()
19/11/19 18:39:35 INFO DAGScheduler: Submitting ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115), which has no missing parents
19/11/19 18:39:35 INFO MemoryStore: Block broadcast_129 stored as values in memory (estimated size 57.2 KB, free 13.5 GB)
19/11/19 18:39:35 INFO MemoryStore: Block broadcast_129_piece0 stored as bytes in memory (estimated size 22.0 KB, free 13.5 GB)
19/11/19 18:39:35 INFO BlockManagerInfo: Added broadcast_129_piece0 in memory on localhost:33891 (size: 22.0 KB, free: 13.5 GB)
19/11/19 18:39:35 INFO SparkContext: Created broadcast 129 from broadcast at DAGScheduler.scala:1161
19/11/19 18:39:35 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115) (first 15 tasks are for partitions Vector(0, 1))
19/11/19 18:39:35 INFO TaskSchedulerImpl: Adding task set 132.0 with 2 tasks
19/11/19 18:39:35 INFO TaskSetManager: Starting task 0.0 in stage 132.0 (TID 159, localhost, executor driver, partition 0, NODE_LOCAL, 7760 bytes)
19/11/19 18:39:35 INFO Executor: Running task 0.0 in stage 132.0 (TID 159)
19/11/19 18:39:35 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/19 18:39:35 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/19 18:39:35 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktesti8VQDn/job_a7a61b78-2f52-4c9d-9f17-c4d098358aa2/MANIFEST
19/11/19 18:39:35 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktesti8VQDn/job_a7a61b78-2f52-4c9d-9f17-c4d098358aa2/MANIFEST -> 0 artifacts
19/11/19 18:39:36 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/19 18:39:36 INFO main: Logging handler created.
19/11/19 18:39:36 INFO start: Status HTTP server running at localhost:43641
19/11/19 18:39:36 INFO main: semi_persistent_directory: /tmp
19/11/19 18:39:36 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/19 18:39:36 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574188772.51_2b65cc30-6e99-4365-8ea4-299df032ffd6', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/19 18:39:36 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574188772.51', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:47327'}
19/11/19 18:39:36 INFO __init__: Creating state cache with size 0
19/11/19 18:39:36 INFO __init__: Creating insecure control channel for localhost:34291.
19/11/19 18:39:36 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/19 18:39:36 INFO __init__: Control channel established.
19/11/19 18:39:36 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/19 18:39:36 INFO create_state_handler: Creating insecure state channel for localhost:36553.
19/11/19 18:39:36 INFO create_state_handler: State channel established.
19/11/19 18:39:36 INFO create_data_channel: Creating client data channel for localhost:35539
19/11/19 18:39:36 INFO GrpcDataService: Beam Fn Data client connected.
19/11/19 18:39:36 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/19 18:39:36 INFO run: No more requests from control plane
19/11/19 18:39:36 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/19 18:39:36 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 18:39:36 INFO close: Closing all cached grpc data channels.
19/11/19 18:39:36 INFO close: Closing all cached gRPC state handlers.
19/11/19 18:39:36 INFO run: Done consuming work.
19/11/19 18:39:36 INFO main: Python sdk harness exiting.
19/11/19 18:39:36 INFO GrpcLoggingService: Logging client hanged up.
19/11/19 18:39:36 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 18:39:36 INFO Executor: Finished task 0.0 in stage 132.0 (TID 159). 15229 bytes result sent to driver
19/11/19 18:39:36 INFO TaskSetManager: Starting task 1.0 in stage 132.0 (TID 160, localhost, executor driver, partition 1, PROCESS_LOCAL, 7977 bytes)
19/11/19 18:39:36 INFO Executor: Running task 1.0 in stage 132.0 (TID 160)
19/11/19 18:39:36 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 159) in 1029 ms on localhost (executor driver) (1/2)
19/11/19 18:39:36 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktesti8VQDn/job_a7a61b78-2f52-4c9d-9f17-c4d098358aa2/MANIFEST
19/11/19 18:39:36 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktesti8VQDn/job_a7a61b78-2f52-4c9d-9f17-c4d098358aa2/MANIFEST -> 0 artifacts
19/11/19 18:39:37 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/19 18:39:37 INFO main: Logging handler created.
19/11/19 18:39:37 INFO start: Status HTTP server running at localhost:42305
19/11/19 18:39:37 INFO main: semi_persistent_directory: /tmp
19/11/19 18:39:37 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/19 18:39:37 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574188772.51_2b65cc30-6e99-4365-8ea4-299df032ffd6', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/19 18:39:37 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574188772.51', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:47327'}
19/11/19 18:39:37 INFO __init__: Creating state cache with size 0
19/11/19 18:39:37 INFO __init__: Creating insecure control channel for localhost:41083.
19/11/19 18:39:37 INFO __init__: Control channel established.
19/11/19 18:39:37 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/19 18:39:37 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/19 18:39:37 INFO create_state_handler: Creating insecure state channel for localhost:33445.
19/11/19 18:39:37 INFO create_state_handler: State channel established.
19/11/19 18:39:37 INFO create_data_channel: Creating client data channel for localhost:42889
19/11/19 18:39:37 INFO GrpcDataService: Beam Fn Data client connected.
19/11/19 18:39:37 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/19 18:39:37 INFO run: No more requests from control plane
19/11/19 18:39:37 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/19 18:39:37 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 18:39:37 INFO close: Closing all cached grpc data channels.
19/11/19 18:39:37 INFO close: Closing all cached gRPC state handlers.
19/11/19 18:39:37 INFO run: Done consuming work.
19/11/19 18:39:37 INFO main: Python sdk harness exiting.
19/11/19 18:39:37 INFO GrpcLoggingService: Logging client hanged up.
19/11/19 18:39:37 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 18:39:37 INFO Executor: Finished task 1.0 in stage 132.0 (TID 160). 13710 bytes result sent to driver
19/11/19 18:39:37 INFO TaskSetManager: Finished task 1.0 in stage 132.0 (TID 160) in 926 ms on localhost (executor driver) (2/2)
19/11/19 18:39:37 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/19 18:39:37 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 1.961 s
19/11/19 18:39:37 INFO DAGScheduler: looking for newly runnable stages
19/11/19 18:39:37 INFO DAGScheduler: running: Set()
19/11/19 18:39:37 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/19 18:39:37 INFO DAGScheduler: failed: Set()
19/11/19 18:39:37 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/19 18:39:37 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.5 GB)
19/11/19 18:39:37 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.4 KB, free 13.5 GB)
19/11/19 18:39:37 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:33891 (size: 12.4 KB, free: 13.5 GB)
19/11/19 18:39:37 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/19 18:39:37 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/19 18:39:37 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/19 18:39:37 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/19 18:39:37 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/19 18:39:37 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/19 18:39:37 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/19 18:39:37 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktesti8VQDn/job_a7a61b78-2f52-4c9d-9f17-c4d098358aa2/MANIFEST
19/11/19 18:39:37 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktesti8VQDn/job_a7a61b78-2f52-4c9d-9f17-c4d098358aa2/MANIFEST -> 0 artifacts
19/11/19 18:39:38 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/19 18:39:38 INFO main: Logging handler created.
19/11/19 18:39:38 INFO start: Status HTTP server running at localhost:42409
19/11/19 18:39:38 INFO main: semi_persistent_directory: /tmp
19/11/19 18:39:38 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/19 18:39:38 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574188772.51_2b65cc30-6e99-4365-8ea4-299df032ffd6', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/19 18:39:38 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574188772.51', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:47327'}
19/11/19 18:39:38 INFO __init__: Creating state cache with size 0
19/11/19 18:39:38 INFO __init__: Creating insecure control channel for localhost:45719.
19/11/19 18:39:38 INFO __init__: Control channel established.
19/11/19 18:39:38 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/19 18:39:38 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/19 18:39:38 INFO create_state_handler: Creating insecure state channel for localhost:39893.
19/11/19 18:39:38 INFO create_state_handler: State channel established.
19/11/19 18:39:38 INFO create_data_channel: Creating client data channel for localhost:34371
19/11/19 18:39:38 INFO GrpcDataService: Beam Fn Data client connected.
19/11/19 18:39:38 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/19 18:39:38 INFO run: No more requests from control plane
19/11/19 18:39:38 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/19 18:39:38 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 18:39:38 INFO close: Closing all cached grpc data channels.
19/11/19 18:39:38 INFO close: Closing all cached gRPC state handlers.
19/11/19 18:39:38 INFO run: Done consuming work.
19/11/19 18:39:38 INFO main: Python sdk harness exiting.
19/11/19 18:39:38 INFO GrpcLoggingService: Logging client hanged up.
19/11/19 18:39:38 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 18:39:38 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 11970 bytes result sent to driver
19/11/19 18:39:38 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 932 ms on localhost (executor driver) (1/1)
19/11/19 18:39:38 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/19 18:39:38 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 0.938 s
19/11/19 18:39:38 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 4.780602 s
19/11/19 18:39:38 INFO SparkPipelineRunner: Job test_windowing_1574188772.51_2b65cc30-6e99-4365-8ea4-299df032ffd6 finished.
19/11/19 18:39:38 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/19 18:39:38 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktesti8VQDn/job_a7a61b78-2f52-4c9d-9f17-c4d098358aa2/MANIFEST has 0 artifact locations
19/11/19 18:39:38 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktesti8VQDn/job_a7a61b78-2f52-4c9d-9f17-c4d098358aa2/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140655105300224)>
# Thread: <Thread(Thread-118, started daemon 140655088514816)>
# Thread: <_MainThread(MainThread, started 140655884531456)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140654600513280)>
# Thread: <Thread(Thread-124, started daemon 140654608905984)>
# Thread: <Thread(Thread-118, started daemon 140655088514816)>
# Thread: <_MainThread(MainThread, started 140655884531456)>
# Thread: <Thread(wait_until_finish_read, started daemon 140655105300224)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574188759.16_845802d2-6787-40d9-96b0-673f242bfb9a failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 320.629s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 15s
59 actionable tasks: 47 executed, 12 from cache

Publishing build scan...
https://gradle.com/s/vt33ruttuzcj2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1571

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1571/display/redirect?page=changes>

Changes:

[suztomo] [BEAM-8654] Fix resolutionStrategy's interference with dependency check

[suztomo] [BEAM-8737] Incorporate TRIAGE NEEDED status

[suztomo] Handle None object returned from jira_manager.run

[lukecwik] [BEAM-8729] Gracefully skip irrelevant http/https lines from

[lukecwik] [BEAM-8335] Add timestamp and duration to/from protos to Python SDK


------------------------------------------
[...truncated 1.65 MB...]
19/11/19 17:32:23 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/19 17:32:23 INFO close: Closing all cached grpc data channels.
19/11/19 17:32:23 INFO close: Closing all cached gRPC state handlers.
19/11/19 17:32:23 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 17:32:23 INFO run: Done consuming work.
19/11/19 17:32:23 INFO main: Python sdk harness exiting.
19/11/19 17:32:23 INFO GrpcLoggingService: Logging client hanged up.
19/11/19 17:32:24 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 17:32:24 INFO Executor: Finished task 0.0 in stage 126.0 (TID 153). 15272 bytes result sent to driver
19/11/19 17:32:24 INFO TaskSetManager: Starting task 1.0 in stage 126.0 (TID 154, localhost, executor driver, partition 1, PROCESS_LOCAL, 7977 bytes)
19/11/19 17:32:24 INFO Executor: Running task 1.0 in stage 126.0 (TID 154)
19/11/19 17:32:24 INFO TaskSetManager: Finished task 0.0 in stage 126.0 (TID 153) in 935 ms on localhost (executor driver) (1/2)
19/11/19 17:32:24 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestsYPCfK/job_317c6739-21f8-413d-9cad-a0f90b1d3a93/MANIFEST
19/11/19 17:32:24 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestsYPCfK/job_317c6739-21f8-413d-9cad-a0f90b1d3a93/MANIFEST -> 0 artifacts
19/11/19 17:32:24 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/19 17:32:24 INFO main: Logging handler created.
19/11/19 17:32:24 INFO start: Status HTTP server running at localhost:41975
19/11/19 17:32:24 INFO main: semi_persistent_directory: /tmp
19/11/19 17:32:24 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/19 17:32:24 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574184740.42_21e78f11-33e8-40f0-aca0-1a905f760674', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=29', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/19 17:32:24 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574184740.42', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:55319'}
19/11/19 17:32:24 INFO __init__: Creating state cache with size 0
19/11/19 17:32:24 INFO __init__: Creating insecure control channel for localhost:42057.
19/11/19 17:32:24 INFO __init__: Control channel established.
19/11/19 17:32:24 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 257-1
19/11/19 17:32:24 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/19 17:32:24 INFO create_state_handler: Creating insecure state channel for localhost:40177.
19/11/19 17:32:24 INFO create_state_handler: State channel established.
19/11/19 17:32:24 INFO create_data_channel: Creating client data channel for localhost:34615
19/11/19 17:32:24 INFO GrpcDataService: Beam Fn Data client connected.
19/11/19 17:32:24 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/19 17:32:24 INFO run: No more requests from control plane
19/11/19 17:32:24 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/19 17:32:24 INFO close: Closing all cached grpc data channels.
19/11/19 17:32:24 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 17:32:24 INFO close: Closing all cached gRPC state handlers.
19/11/19 17:32:24 INFO run: Done consuming work.
19/11/19 17:32:24 INFO main: Python sdk harness exiting.
19/11/19 17:32:24 INFO GrpcLoggingService: Logging client hanged up.
19/11/19 17:32:24 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 17:32:24 INFO Executor: Finished task 1.0 in stage 126.0 (TID 154). 13710 bytes result sent to driver
19/11/19 17:32:24 INFO TaskSetManager: Finished task 1.0 in stage 126.0 (TID 154) in 831 ms on localhost (executor driver) (2/2)
19/11/19 17:32:24 INFO TaskSchedulerImpl: Removed TaskSet 126.0, whose tasks have all completed, from pool 
19/11/19 17:32:24 INFO DAGScheduler: ShuffleMapStage 126 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 1.772 s
19/11/19 17:32:24 INFO DAGScheduler: looking for newly runnable stages
19/11/19 17:32:24 INFO DAGScheduler: running: Set()
19/11/19 17:32:24 INFO DAGScheduler: waiting: Set(ResultStage 127)
19/11/19 17:32:24 INFO DAGScheduler: failed: Set()
19/11/19 17:32:24 INFO DAGScheduler: Submitting ResultStage 127 (EmptyOutputSink_0 MapPartitionsRDD[881] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/19 17:32:24 INFO MemoryStore: Block broadcast_124 stored as values in memory (estimated size 26.1 KB, free 13.5 GB)
19/11/19 17:32:24 INFO MemoryStore: Block broadcast_124_piece0 stored as bytes in memory (estimated size 12.4 KB, free 13.5 GB)
19/11/19 17:32:24 INFO BlockManagerInfo: Added broadcast_124_piece0 in memory on localhost:38405 (size: 12.4 KB, free: 13.5 GB)
19/11/19 17:32:24 INFO SparkContext: Created broadcast 124 from broadcast at DAGScheduler.scala:1161
19/11/19 17:32:24 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 127 (EmptyOutputSink_0 MapPartitionsRDD[881] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/19 17:32:24 INFO TaskSchedulerImpl: Adding task set 127.0 with 1 tasks
19/11/19 17:32:24 INFO TaskSetManager: Starting task 0.0 in stage 127.0 (TID 155, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/19 17:32:24 INFO Executor: Running task 0.0 in stage 127.0 (TID 155)
19/11/19 17:32:24 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/19 17:32:24 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/19 17:32:24 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestsYPCfK/job_317c6739-21f8-413d-9cad-a0f90b1d3a93/MANIFEST
19/11/19 17:32:24 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestsYPCfK/job_317c6739-21f8-413d-9cad-a0f90b1d3a93/MANIFEST -> 0 artifacts
19/11/19 17:32:25 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/19 17:32:25 INFO main: Logging handler created.
19/11/19 17:32:25 INFO start: Status HTTP server running at localhost:36013
19/11/19 17:32:25 INFO main: semi_persistent_directory: /tmp
19/11/19 17:32:25 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/19 17:32:25 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574184740.42_21e78f11-33e8-40f0-aca0-1a905f760674', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=29', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/19 17:32:25 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574184740.42', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:55319'}
19/11/19 17:32:25 INFO __init__: Creating state cache with size 0
19/11/19 17:32:25 INFO __init__: Creating insecure control channel for localhost:41763.
19/11/19 17:32:25 INFO __init__: Control channel established.
19/11/19 17:32:25 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/19 17:32:25 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 258-1
19/11/19 17:32:25 INFO create_state_handler: Creating insecure state channel for localhost:46423.
19/11/19 17:32:25 INFO create_state_handler: State channel established.
19/11/19 17:32:25 INFO create_data_channel: Creating client data channel for localhost:33235
19/11/19 17:32:25 INFO GrpcDataService: Beam Fn Data client connected.
19/11/19 17:32:25 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/19 17:32:25 INFO run: No more requests from control plane
19/11/19 17:32:25 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/19 17:32:25 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 17:32:25 INFO close: Closing all cached grpc data channels.
19/11/19 17:32:25 INFO close: Closing all cached gRPC state handlers.
19/11/19 17:32:25 INFO run: Done consuming work.
19/11/19 17:32:25 INFO main: Python sdk harness exiting.
19/11/19 17:32:25 INFO GrpcLoggingService: Logging client hanged up.
19/11/19 17:32:25 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 17:32:25 INFO Executor: Finished task 0.0 in stage 127.0 (TID 155). 11970 bytes result sent to driver
19/11/19 17:32:25 INFO TaskSetManager: Finished task 0.0 in stage 127.0 (TID 155) in 865 ms on localhost (executor driver) (1/1)
19/11/19 17:32:25 INFO TaskSchedulerImpl: Removed TaskSet 127.0, whose tasks have all completed, from pool 
19/11/19 17:32:25 INFO DAGScheduler: ResultStage 127 (foreach at BoundedDataset.java:124) finished in 0.871 s
19/11/19 17:32:25 INFO DAGScheduler: Job 45 finished: foreach at BoundedDataset.java:124, took 4.313323 s
19/11/19 17:32:25 INFO SparkPipelineRunner: Job test_windowing_1574184740.42_21e78f11-33e8-40f0-aca0-1a905f760674 finished.
19/11/19 17:32:25 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/19 17:32:25 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktestsYPCfK/job_317c6739-21f8-413d-9cad-a0f90b1d3a93/MANIFEST has 0 artifact locations
19/11/19 17:32:25 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestsYPCfK/job_317c6739-21f8-413d-9cad-a0f90b1d3a93/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139996563339008)>
# Thread: <Thread(Thread-120, started daemon 139996571731712)>
# Thread: <_MainThread(MainThread, started 139997350962944)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139996061230848)>
# Thread: <Thread(Thread-125, started daemon 139996069623552)>
# Thread: <Thread(Thread-120, started daemon 139996571731712)>
# Thread: <_MainThread(MainThread, started 139997350962944)>
# Thread: <Thread(wait_until_finish_read, started daemon 139996563339008)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139996052838144)>
# Thread: <Thread(Thread-131, started daemon 139996044445440)>
# Thread: <Thread(Thread-125, started daemon 139996069623552)>
# Thread: <_MainThread(MainThread, started 139997350962944)>
# Thread: <Thread(wait_until_finish_read, started daemon 139996061230848)>

======================================================================
ERROR: test_pardo_unfusable_side_inputs (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 244, in test_pardo_unfusable_side_inputs
    equal_to([('a', 'a'), ('a', 'b'), ('b', 'a'), ('b', 'b')]))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
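
This run additionally hits test_pardo_unfusable_side_inputs. The expected output in its frame, the cross product of ['a', 'b'] with itself, suggests a ParDo consuming a side input derived from the same collection, which forces an unfusable edge between the producing and consuming stages. An assumed reconstruction of that shape (not the test's actual code):

    # Assumed reconstruction of a cross product via a side input; the
    # side input forces an unfusable edge between producer and consumer.
    import apache_beam as beam

    with beam.Pipeline() as p:
        pc = p | beam.Create(['a', 'b'])
        crossed = pc | beam.FlatMap(
            lambda x, side: [(x, s) for s in side],
            side=beam.pvalue.AsIter(pc))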

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574184731.42_53bfdfcf-b365-48e9-8fe3-6c7dd8e79eeb failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 347.867s

FAILED (errors=4, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
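
To reproduce this failure outside Jenkins, the same Gradle task can be run from the root of a Beam source checkout; a sketch via Python's subprocess (the wrapper path is an assumption about a standard checkout):

    import subprocess

    # Invoke the failing suite with --stacktrace, as the Gradle output suggests.
    subprocess.check_call([
        './gradlew',
        ':sdks:python:test-suites:portable:py2:sparkValidatesRunner',
        '--stacktrace',
    ])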

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 6s
59 actionable tasks: 58 executed, 1 from cache

Publishing build scan...
https://gradle.com/s/mdgmwo3unr7fk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1570

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1570/display/redirect>

Changes:


------------------------------------------
[...truncated 1.67 MB...]
19/11/19 12:10:09 INFO TaskSchedulerImpl: Removed TaskSet 131.0, whose tasks have all completed, from pool 
19/11/19 12:10:09 INFO DAGScheduler: ShuffleMapStage 131 (mapToPair at GroupCombineFunctions.java:55) finished in 0.879 s
19/11/19 12:10:09 INFO DAGScheduler: looking for newly runnable stages
19/11/19 12:10:09 INFO DAGScheduler: running: Set()
19/11/19 12:10:09 INFO DAGScheduler: waiting: Set(ShuffleMapStage 132, ResultStage 133)
19/11/19 12:10:09 INFO DAGScheduler: failed: Set()
19/11/19 12:10:09 INFO DAGScheduler: Submitting ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115), which has no missing parents
19/11/19 12:10:09 INFO MemoryStore: Block broadcast_129 stored as values in memory (estimated size 57.2 KB, free 13.5 GB)
19/11/19 12:10:09 INFO MemoryStore: Block broadcast_129_piece0 stored as bytes in memory (estimated size 22.0 KB, free 13.5 GB)
19/11/19 12:10:09 INFO BlockManagerInfo: Added broadcast_129_piece0 in memory on localhost:40511 (size: 22.0 KB, free: 13.5 GB)
19/11/19 12:10:09 INFO SparkContext: Created broadcast 129 from broadcast at DAGScheduler.scala:1161
19/11/19 12:10:09 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115) (first 15 tasks are for partitions Vector(0, 1))
19/11/19 12:10:09 INFO TaskSchedulerImpl: Adding task set 132.0 with 2 tasks
19/11/19 12:10:09 INFO TaskSetManager: Starting task 0.0 in stage 132.0 (TID 159, localhost, executor driver, partition 0, NODE_LOCAL, 7760 bytes)
19/11/19 12:10:09 INFO Executor: Running task 0.0 in stage 132.0 (TID 159)
19/11/19 12:10:09 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/19 12:10:09 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/19 12:10:09 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestQ4BvfR/job_2eacb0f1-70cc-44ad-84a7-9dd6d7e00cd2/MANIFEST
19/11/19 12:10:09 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestQ4BvfR/job_2eacb0f1-70cc-44ad-84a7-9dd6d7e00cd2/MANIFEST -> 0 artifacts
19/11/19 12:10:09 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/19 12:10:09 INFO main: Logging handler created.
19/11/19 12:10:09 INFO start: Status HTTP server running at localhost:36875
19/11/19 12:10:09 INFO main: semi_persistent_directory: /tmp
19/11/19 12:10:09 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/19 12:10:09 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574165406.49_188c0f80-1ada-4fa8-9544-3a1320d311e6', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/19 12:10:09 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574165406.49', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37767'}
19/11/19 12:10:09 INFO __init__: Creating state cache with size 0
19/11/19 12:10:09 INFO __init__: Creating insecure control channel for localhost:44147.
19/11/19 12:10:09 INFO __init__: Control channel established.
19/11/19 12:10:09 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/19 12:10:09 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/19 12:10:09 INFO create_state_handler: Creating insecure state channel for localhost:36525.
19/11/19 12:10:09 INFO create_state_handler: State channel established.
19/11/19 12:10:09 INFO create_data_channel: Creating client data channel for localhost:45241
19/11/19 12:10:09 INFO GrpcDataService: Beam Fn Data client connected.
19/11/19 12:10:10 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/19 12:10:10 INFO run: No more requests from control plane
19/11/19 12:10:10 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/19 12:10:10 INFO close: Closing all cached grpc data channels.
19/11/19 12:10:10 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 12:10:10 INFO close: Closing all cached gRPC state handlers.
19/11/19 12:10:10 INFO run: Done consuming work.
19/11/19 12:10:10 INFO main: Python sdk harness exiting.
19/11/19 12:10:10 INFO GrpcLoggingService: Logging client hanged up.
19/11/19 12:10:10 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 12:10:10 INFO Executor: Finished task 0.0 in stage 132.0 (TID 159). 15229 bytes result sent to driver
19/11/19 12:10:10 INFO TaskSetManager: Starting task 1.0 in stage 132.0 (TID 160, localhost, executor driver, partition 1, PROCESS_LOCAL, 7977 bytes)
19/11/19 12:10:10 INFO Executor: Running task 1.0 in stage 132.0 (TID 160)
19/11/19 12:10:10 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 159) in 907 ms on localhost (executor driver) (1/2)
19/11/19 12:10:10 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestQ4BvfR/job_2eacb0f1-70cc-44ad-84a7-9dd6d7e00cd2/MANIFEST
19/11/19 12:10:10 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestQ4BvfR/job_2eacb0f1-70cc-44ad-84a7-9dd6d7e00cd2/MANIFEST -> 0 artifacts
19/11/19 12:10:10 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/19 12:10:10 INFO main: Logging handler created.
19/11/19 12:10:10 INFO start: Status HTTP server running at localhost:41319
19/11/19 12:10:10 INFO main: semi_persistent_directory: /tmp
19/11/19 12:10:10 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/19 12:10:10 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574165406.49_188c0f80-1ada-4fa8-9544-3a1320d311e6', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/19 12:10:10 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574165406.49', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37767'}
19/11/19 12:10:10 INFO __init__: Creating state cache with size 0
19/11/19 12:10:10 INFO __init__: Creating insecure control channel for localhost:39379.
19/11/19 12:10:10 INFO __init__: Control channel established.
19/11/19 12:10:10 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/19 12:10:10 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/19 12:10:10 INFO create_state_handler: Creating insecure state channel for localhost:35443.
19/11/19 12:10:10 INFO create_state_handler: State channel established.
19/11/19 12:10:10 INFO create_data_channel: Creating client data channel for localhost:36587
19/11/19 12:10:10 INFO GrpcDataService: Beam Fn Data client connected.
19/11/19 12:10:10 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/19 12:10:10 INFO run: No more requests from control plane
19/11/19 12:10:10 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/19 12:10:10 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 12:10:10 INFO close: Closing all cached grpc data channels.
19/11/19 12:10:10 INFO close: Closing all cached gRPC state handlers.
19/11/19 12:10:10 INFO run: Done consuming work.
19/11/19 12:10:10 INFO main: Python sdk harness exiting.
19/11/19 12:10:10 INFO GrpcLoggingService: Logging client hanged up.
19/11/19 12:10:10 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 12:10:10 INFO Executor: Finished task 1.0 in stage 132.0 (TID 160). 13710 bytes result sent to driver
19/11/19 12:10:10 INFO TaskSetManager: Finished task 1.0 in stage 132.0 (TID 160) in 834 ms on localhost (executor driver) (2/2)
19/11/19 12:10:10 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/19 12:10:10 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 1.746 s
19/11/19 12:10:10 INFO DAGScheduler: looking for newly runnable stages
19/11/19 12:10:10 INFO DAGScheduler: running: Set()
19/11/19 12:10:10 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/19 12:10:10 INFO DAGScheduler: failed: Set()
19/11/19 12:10:10 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/19 12:10:10 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.5 GB)
19/11/19 12:10:10 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.4 KB, free 13.5 GB)
19/11/19 12:10:10 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:40511 (size: 12.4 KB, free: 13.5 GB)
19/11/19 12:10:10 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/19 12:10:10 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/19 12:10:10 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/19 12:10:10 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/19 12:10:10 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/19 12:10:10 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/19 12:10:10 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/19 12:10:11 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestQ4BvfR/job_2eacb0f1-70cc-44ad-84a7-9dd6d7e00cd2/MANIFEST
19/11/19 12:10:11 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestQ4BvfR/job_2eacb0f1-70cc-44ad-84a7-9dd6d7e00cd2/MANIFEST -> 0 artifacts
19/11/19 12:10:11 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/19 12:10:11 INFO main: Logging handler created.
19/11/19 12:10:11 INFO start: Status HTTP server running at localhost:32995
19/11/19 12:10:11 INFO main: semi_persistent_directory: /tmp
19/11/19 12:10:11 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/19 12:10:11 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574165406.49_188c0f80-1ada-4fa8-9544-3a1320d311e6', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/19 12:10:11 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574165406.49', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37767'}
19/11/19 12:10:11 INFO __init__: Creating state cache with size 0
19/11/19 12:10:11 INFO __init__: Creating insecure control channel for localhost:32915.
19/11/19 12:10:11 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/19 12:10:11 INFO __init__: Control channel established.
19/11/19 12:10:11 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/19 12:10:11 INFO create_state_handler: Creating insecure state channel for localhost:44537.
19/11/19 12:10:11 INFO create_state_handler: State channel established.
19/11/19 12:10:11 INFO create_data_channel: Creating client data channel for localhost:36515
19/11/19 12:10:11 INFO GrpcDataService: Beam Fn Data client connected.
19/11/19 12:10:11 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/19 12:10:11 INFO run: No more requests from control plane
19/11/19 12:10:11 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/19 12:10:11 INFO close: Closing all cached grpc data channels.
19/11/19 12:10:11 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 12:10:11 INFO close: Closing all cached gRPC state handlers.
19/11/19 12:10:11 INFO run: Done consuming work.
19/11/19 12:10:11 INFO main: Python sdk harness exiting.
19/11/19 12:10:11 INFO GrpcLoggingService: Logging client hanged up.
19/11/19 12:10:11 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 12:10:11 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 11970 bytes result sent to driver
19/11/19 12:10:11 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 854 ms on localhost (executor driver) (1/1)
19/11/19 12:10:11 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/19 12:10:11 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 0.861 s
19/11/19 12:10:11 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 4.319864 s
19/11/19 12:10:11 INFO SparkPipelineRunner: Job test_windowing_1574165406.49_188c0f80-1ada-4fa8-9544-3a1320d311e6 finished.
19/11/19 12:10:11 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/19 12:10:11 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktestQ4BvfR/job_2eacb0f1-70cc-44ad-84a7-9dd6d7e00cd2/MANIFEST has 0 artifact locations
19/11/19 12:10:11 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestQ4BvfR/job_2eacb0f1-70cc-44ad-84a7-9dd6d7e00cd2/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
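
The passing run above was driven by the pipeline_options logged earlier in this build (PortableRunner against a local job server, with a PROCESS environment that launches sdk_worker.sh). A minimal sketch of submitting a pipeline with equivalent options; the endpoint comes from the log and the worker-script path is a placeholder:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:37767',  # job server endpoint from the log above
        '--environment_type=PROCESS',
        '--environment_config={"command": "/path/to/sdk_worker.sh"}',  # placeholder path
    ])
    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create(['a', 'b', 'c']) | beam.Map(print)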
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
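
The blocking iteration in these tracebacks (for state_response in self._state_stream) is a gRPC server-streaming call against the job service's GetStateStream RPC; each loop step blocks until the job server emits a new state, so a wedged job stalls the test until the watchdog fires. A rough sketch of that interaction (the stub and message names follow the Beam job-management proto and should be read as assumptions):

    import grpc
    from apache_beam.portability.api import beam_job_api_pb2, beam_job_api_pb2_grpc

    channel = grpc.insecure_channel('localhost:37767')  # job server from the log
    stub = beam_job_api_pb2_grpc.JobServiceStub(channel)
    request = beam_job_api_pb2.GetJobStateRequest(job_id='some-job-id')  # hypothetical id
    # Server-streaming call: iterating blocks until a new state arrives.
    for state_event in stub.GetStateStream(request):
        print(state_event.state)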

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140690732295936)>
# Thread: <Thread(Thread-117, started daemon 140690723903232)>
# Thread: <_MainThread(MainThread, started 140691520108288)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140690243712768)>
# Thread: <Thread(Thread-123, started daemon 140690235320064)>
# Thread: <_MainThread(MainThread, started 140691520108288)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574165397.18_d55c3a48-ad70-4e96-853b-ff6906bbce29 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 299.689s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 40s
59 actionable tasks: 46 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/qgxv2ufdcy6py

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1569

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1569/display/redirect?page=changes>

Changes:

[mxm] [BEAM-8672] Keep Python process alive when using LOOPBACK execution mode


------------------------------------------
[...truncated 1.65 MB...]
19/11/19 11:19:54 INFO DAGScheduler: waiting: Set(ShuffleMapStage 132, ResultStage 133)
19/11/19 11:19:54 INFO DAGScheduler: failed: Set()
19/11/19 11:19:54 INFO DAGScheduler: Submitting ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115), which has no missing parents
19/11/19 11:19:54 INFO MemoryStore: Block broadcast_129 stored as values in memory (estimated size 57.2 KB, free 13.5 GB)
19/11/19 11:19:54 INFO MemoryStore: Block broadcast_129_piece0 stored as bytes in memory (estimated size 22.1 KB, free 13.5 GB)
19/11/19 11:19:54 INFO BlockManagerInfo: Added broadcast_129_piece0 in memory on localhost:33177 (size: 22.1 KB, free: 13.5 GB)
19/11/19 11:19:54 INFO SparkContext: Created broadcast 129 from broadcast at DAGScheduler.scala:1161
19/11/19 11:19:54 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115) (first 15 tasks are for partitions Vector(0, 1))
19/11/19 11:19:54 INFO TaskSchedulerImpl: Adding task set 132.0 with 2 tasks
19/11/19 11:19:54 INFO TaskSetManager: Starting task 0.0 in stage 132.0 (TID 159, localhost, executor driver, partition 0, NODE_LOCAL, 7760 bytes)
19/11/19 11:19:54 INFO Executor: Running task 0.0 in stage 132.0 (TID 159)
19/11/19 11:19:54 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/19 11:19:54 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/19 11:19:54 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestrgHj7V/job_fe6b28e6-0f7b-4408-9a5d-74f295866710/MANIFEST
19/11/19 11:19:54 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestrgHj7V/job_fe6b28e6-0f7b-4408-9a5d-74f295866710/MANIFEST -> 0 artifacts
19/11/19 11:19:54 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/19 11:19:54 INFO main: Logging handler created.
19/11/19 11:19:54 INFO start: Status HTTP server running at localhost:33675
19/11/19 11:19:54 INFO main: semi_persistent_directory: /tmp
19/11/19 11:19:54 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/19 11:19:54 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574162390.74_10535a78-9302-4bfe-b8f1-5ea2356ddbd0', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/19 11:19:54 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574162390.74', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44747'}
19/11/19 11:19:54 INFO __init__: Creating state cache with size 0
19/11/19 11:19:54 INFO __init__: Creating insecure control channel for localhost:42031.
19/11/19 11:19:54 INFO __init__: Control channel established.
19/11/19 11:19:54 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/19 11:19:54 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/19 11:19:54 INFO create_state_handler: Creating insecure state channel for localhost:33325.
19/11/19 11:19:54 INFO create_state_handler: State channel established.
19/11/19 11:19:54 INFO create_data_channel: Creating client data channel for localhost:39005
19/11/19 11:19:54 INFO GrpcDataService: Beam Fn Data client connected.
19/11/19 11:19:54 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/19 11:19:55 INFO run: No more requests from control plane
19/11/19 11:19:55 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/19 11:19:55 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 11:19:55 INFO close: Closing all cached grpc data channels.
19/11/19 11:19:55 INFO close: Closing all cached gRPC state handlers.
19/11/19 11:19:55 INFO run: Done consuming work.
19/11/19 11:19:55 INFO main: Python sdk harness exiting.
19/11/19 11:19:55 INFO GrpcLoggingService: Logging client hanged up.
19/11/19 11:19:55 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 11:19:55 INFO Executor: Finished task 0.0 in stage 132.0 (TID 159). 15229 bytes result sent to driver
19/11/19 11:19:55 INFO TaskSetManager: Starting task 1.0 in stage 132.0 (TID 160, localhost, executor driver, partition 1, PROCESS_LOCAL, 7977 bytes)
19/11/19 11:19:55 INFO Executor: Running task 1.0 in stage 132.0 (TID 160)
19/11/19 11:19:55 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 159) in 1063 ms on localhost (executor driver) (1/2)
19/11/19 11:19:55 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestrgHj7V/job_fe6b28e6-0f7b-4408-9a5d-74f295866710/MANIFEST
19/11/19 11:19:55 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestrgHj7V/job_fe6b28e6-0f7b-4408-9a5d-74f295866710/MANIFEST -> 0 artifacts
19/11/19 11:19:55 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/19 11:19:55 INFO main: Logging handler created.
19/11/19 11:19:55 INFO main: semi_persistent_directory: /tmp
19/11/19 11:19:55 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/19 11:19:55 INFO start: Status HTTP server running at localhost:40117
19/11/19 11:19:55 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574162390.74_10535a78-9302-4bfe-b8f1-5ea2356ddbd0', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/19 11:19:55 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574162390.74', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44747'}
19/11/19 11:19:55 INFO __init__: Creating state cache with size 0
19/11/19 11:19:55 INFO __init__: Creating insecure control channel for localhost:41305.
19/11/19 11:19:55 INFO __init__: Control channel established.
19/11/19 11:19:55 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/19 11:19:55 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/19 11:19:55 INFO create_state_handler: Creating insecure state channel for localhost:43003.
19/11/19 11:19:55 INFO create_state_handler: State channel established.
19/11/19 11:19:55 INFO create_data_channel: Creating client data channel for localhost:38635
19/11/19 11:19:55 INFO GrpcDataService: Beam Fn Data client connected.
19/11/19 11:19:55 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/19 11:19:55 INFO run: No more requests from control plane
19/11/19 11:19:55 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/19 11:19:55 INFO close: Closing all cached grpc data channels.
19/11/19 11:19:55 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 11:19:55 INFO close: Closing all cached gRPC state handlers.
19/11/19 11:19:55 INFO run: Done consuming work.
19/11/19 11:19:55 INFO main: Python sdk harness exiting.
19/11/19 11:19:55 INFO GrpcLoggingService: Logging client hanged up.
19/11/19 11:19:56 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 11:19:56 INFO Executor: Finished task 1.0 in stage 132.0 (TID 160). 13710 bytes result sent to driver
19/11/19 11:19:56 INFO TaskSetManager: Finished task 1.0 in stage 132.0 (TID 160) in 922 ms on localhost (executor driver) (2/2)
19/11/19 11:19:56 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/19 11:19:56 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 1.991 s
19/11/19 11:19:56 INFO DAGScheduler: looking for newly runnable stages
19/11/19 11:19:56 INFO DAGScheduler: running: Set()
19/11/19 11:19:56 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/19 11:19:56 INFO DAGScheduler: failed: Set()
19/11/19 11:19:56 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/19 11:19:56 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.5 GB)
19/11/19 11:19:56 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.3 KB, free 13.5 GB)
19/11/19 11:19:56 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:33177 (size: 12.3 KB, free: 13.5 GB)
19/11/19 11:19:56 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/19 11:19:56 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/19 11:19:56 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/19 11:19:56 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/19 11:19:56 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/19 11:19:56 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/19 11:19:56 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/19 11:19:56 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestrgHj7V/job_fe6b28e6-0f7b-4408-9a5d-74f295866710/MANIFEST
19/11/19 11:19:56 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestrgHj7V/job_fe6b28e6-0f7b-4408-9a5d-74f295866710/MANIFEST -> 0 artifacts
19/11/19 11:19:56 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/19 11:19:56 INFO main: Logging handler created.
19/11/19 11:19:56 INFO start: Status HTTP server running at localhost:34047
19/11/19 11:19:56 INFO main: semi_persistent_directory: /tmp
19/11/19 11:19:56 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/19 11:19:56 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574162390.74_10535a78-9302-4bfe-b8f1-5ea2356ddbd0', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/19 11:19:56 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574162390.74', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44747'}
19/11/19 11:19:56 INFO __init__: Creating state cache with size 0
19/11/19 11:19:56 INFO __init__: Creating insecure control channel for localhost:46767.
19/11/19 11:19:56 INFO __init__: Control channel established.
19/11/19 11:19:56 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/19 11:19:56 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/19 11:19:56 INFO create_state_handler: Creating insecure state channel for localhost:39559.
19/11/19 11:19:56 INFO create_state_handler: State channel established.
19/11/19 11:19:56 INFO create_data_channel: Creating client data channel for localhost:38141
19/11/19 11:19:56 INFO GrpcDataService: Beam Fn Data client connected.
19/11/19 11:19:56 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/19 11:19:56 INFO run: No more requests from control plane
19/11/19 11:19:56 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/19 11:19:56 INFO close: Closing all cached grpc data channels.
19/11/19 11:19:56 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 11:19:56 INFO close: Closing all cached gRPC state handlers.
19/11/19 11:19:56 INFO run: Done consuming work.
19/11/19 11:19:56 INFO main: Python sdk harness exiting.
19/11/19 11:19:56 INFO GrpcLoggingService: Logging client hanged up.
19/11/19 11:19:56 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 11:19:56 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 11970 bytes result sent to driver
19/11/19 11:19:56 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 878 ms on localhost (executor driver) (1/1)
19/11/19 11:19:56 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/19 11:19:56 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 0.883 s
19/11/19 11:19:56 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 4.999449 s
19/11/19 11:19:56 INFO SparkPipelineRunner: Job test_windowing_1574162390.74_10535a78-9302-4bfe-b8f1-5ea2356ddbd0 finished.
19/11/19 11:19:56 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/19 11:19:56 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktestrgHj7V/job_fe6b28e6-0f7b-4408-9a5d-74f295866710/MANIFEST has 0 artifact locations
19/11/19 11:19:56 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestrgHj7V/job_fe6b28e6-0f7b-4408-9a5d-74f295866710/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139847646816000)>
# Thread: <Thread(Thread-117, started daemon 139847155054336)>
# Thread: <_MainThread(MainThread, started 139848426047232)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139847138268928)>
# Thread: <Thread(Thread-122, started daemon 139847146661632)>
# Thread: <Thread(Thread-117, started daemon 139847155054336)>
# Thread: <Thread(wait_until_finish_read, started daemon 139847646816000)>
# Thread: <_MainThread(MainThread, started 139848426047232)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574162380.39_b367dd5c-c534-46dd-9a66-070465967f33 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
----------------------------------------------------------------------
Ran 38 tests in 335.607s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 47s
59 actionable tasks: 46 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/tbp3wadtccfny

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1568

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1568/display/redirect>

Changes:


------------------------------------------
[...truncated 1.74 MB...]

# Thread: <Thread(wait_until_finish_read, started daemon 140030286751488)>
# Thread: <Thread(Thread-125, started daemon 140030295144192)>
# Thread: <_MainThread(MainThread, started 140031175235328)>
# Thread: <Thread(Thread-120, started daemon 140030387402496)>
# Thread: <Thread(wait_until_finish_read, started daemon 140030379009792)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140030253180672)>
# Thread: <Thread(Thread-134, started daemon 140030261573376)>
# Thread: <Thread(Thread-120, started daemon 140030387402496)>
# Thread: <Thread(wait_until_finish_read, started daemon 140030269966080)>
# Thread: <Thread(Thread-130, started daemon 140030278358784)>
# Thread: <Thread(wait_until_finish_read, started daemon 140030379009792)>
# Thread: <Thread(Thread-125, started daemon 140030295144192)>
# Thread: <_MainThread(MainThread, started 140031175235328)>
# Thread: <Thread(wait_until_finish_read, started daemon 140030286751488)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140029355620096)>
# Thread: <Thread(Thread-138, started daemon 140030244787968)>
# Thread: <Thread(Thread-134, started daemon 140030261573376)>
# Thread: <Thread(Thread-130, started daemon 140030278358784)>
# Thread: <Thread(wait_until_finish_read, started daemon 140030286751488)>
# Thread: <Thread(Thread-125, started daemon 140030295144192)>
# Thread: <Thread(wait_until_finish_read, started daemon 140030253180672)>
# Thread: <_MainThread(MainThread, started 140031175235328)>
# Thread: <Thread(wait_until_finish_read, started daemon 140030269966080)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140030395795200)>
# Thread: <Thread(Thread-142, started daemon 140030387402496)>
# Thread: <Thread(Thread-134, started daemon 140030261573376)>
# Thread: <Thread(Thread-130, started daemon 140030278358784)>
# Thread: <Thread(wait_until_finish_read, started daemon 140030253180672)>
# Thread: <_MainThread(MainThread, started 140031175235328)>
# Thread: <Thread(wait_until_finish_read, started daemon 140030269966080)>

======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 421, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 421, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_unfusable_side_inputs (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 244, in test_pardo_unfusable_side_inputs
    equal_to([('a', 'a'), ('a', 'b'), ('b', 'a'), ('b', 'b')]))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 421, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_windowed_side_inputs (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 181, in test_pardo_windowed_side_inputs
    label='windowed')
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 421, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_read (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 578, in test_read
    equal_to(['a', 'b', 'c']))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 421, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_reshuffle (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 548, in test_reshuffle
    equal_to([1, 2, 3]))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 421, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 431, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574144456.18_1439b51b-8901-4a41-887f-b0c424e5d4e6 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
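
Unlike the timeouts, test_sdf_with_watermark_tracking fails inside the pipeline itself: the splittable DoFn tries to checkpoint (hand residual work back to the runner), and the Spark portable runner has no bundle checkpoint handler registered to accept it. A rough sketch of that contract, with hypothetical Python names standing in for Beam's Java-side ActiveBundle:

class ActiveBundleSketch(object):
    """Illustrative stand-in for the runner-side ActiveBundle."""

    def __init__(self, checkpoint_handler=None):
        # The Spark portable runner effectively runs with handler=None here.
        self._checkpoint_handler = checkpoint_handler

    def on_bundle_response(self, residual_roots):
        # An SDF that defers work returns residual roots; a runner that
        # cannot resume them has to reject the checkpoint outright.
        if residual_roots:
            if self._checkpoint_handler is None:
                raise NotImplementedError(
                    'The ActiveBundle does not have a registered bundle '
                    'checkpoint handler.')
            self._checkpoint_handler(residual_roots)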

----------------------------------------------------------------------
Ran 38 tests in 612.466s

FAILED (errors=7, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 12m 52s
59 actionable tasks: 46 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/6kej7ob675d7c

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1567

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1567/display/redirect?page=changes>

Changes:

[github] Merge pull request #10003: [BEAM-6756] Create Iterable type for Schema


------------------------------------------
[...truncated 1.67 MB...]
19/11/19 04:30:58 INFO DAGScheduler: waiting: Set(ShuffleMapStage 132, ResultStage 133)
19/11/19 04:30:58 INFO DAGScheduler: failed: Set()
19/11/19 04:30:58 INFO DAGScheduler: Submitting ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115), which has no missing parents
19/11/19 04:30:58 INFO MemoryStore: Block broadcast_129 stored as values in memory (estimated size 57.2 KB, free 13.5 GB)
19/11/19 04:30:58 INFO MemoryStore: Block broadcast_129_piece0 stored as bytes in memory (estimated size 22.0 KB, free 13.5 GB)
19/11/19 04:30:58 INFO BlockManagerInfo: Added broadcast_129_piece0 in memory on localhost:44289 (size: 22.0 KB, free: 13.5 GB)
19/11/19 04:30:58 INFO SparkContext: Created broadcast 129 from broadcast at DAGScheduler.scala:1161
19/11/19 04:30:58 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115) (first 15 tasks are for partitions Vector(0, 1))
19/11/19 04:30:58 INFO TaskSchedulerImpl: Adding task set 132.0 with 2 tasks
19/11/19 04:30:58 INFO TaskSetManager: Starting task 0.0 in stage 132.0 (TID 159, localhost, executor driver, partition 0, NODE_LOCAL, 7760 bytes)
19/11/19 04:30:58 INFO Executor: Running task 0.0 in stage 132.0 (TID 159)
19/11/19 04:30:58 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/19 04:30:58 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/19 04:30:58 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestZbbsUu/job_7933c10b-c7ac-4845-8956-5cc4e706b53e/MANIFEST
19/11/19 04:30:58 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestZbbsUu/job_7933c10b-c7ac-4845-8956-5cc4e706b53e/MANIFEST -> 0 artifacts
19/11/19 04:30:59 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/19 04:30:59 INFO main: Logging handler created.
19/11/19 04:30:59 INFO start: Status HTTP server running at localhost:44415
19/11/19 04:30:59 INFO main: semi_persistent_directory: /tmp
19/11/19 04:30:59 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/19 04:30:59 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574137855.93_88d536d1-7ffe-425e-9997-90bdf6c3ebae', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/19 04:30:59 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574137855.93', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37675'}
19/11/19 04:30:59 INFO __init__: Creating state cache with size 0
19/11/19 04:30:59 INFO __init__: Creating insecure control channel for localhost:32849.
19/11/19 04:30:59 INFO __init__: Control channel established.
19/11/19 04:30:59 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/19 04:30:59 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/19 04:30:59 INFO create_state_handler: Creating insecure state channel for localhost:38639.
19/11/19 04:30:59 INFO create_state_handler: State channel established.
19/11/19 04:30:59 INFO create_data_channel: Creating client data channel for localhost:42073
19/11/19 04:30:59 INFO GrpcDataService: Beam Fn Data client connected.
19/11/19 04:30:59 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/19 04:30:59 INFO run: No more requests from control plane
19/11/19 04:30:59 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/19 04:30:59 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 04:30:59 INFO close: Closing all cached grpc data channels.
19/11/19 04:30:59 INFO close: Closing all cached gRPC state handlers.
19/11/19 04:30:59 INFO run: Done consuming work.
19/11/19 04:30:59 INFO main: Python sdk harness exiting.
19/11/19 04:30:59 INFO GrpcLoggingService: Logging client hanged up.
19/11/19 04:30:59 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 04:30:59 INFO Executor: Finished task 0.0 in stage 132.0 (TID 159). 15229 bytes result sent to driver
19/11/19 04:30:59 INFO TaskSetManager: Starting task 1.0 in stage 132.0 (TID 160, localhost, executor driver, partition 1, PROCESS_LOCAL, 7977 bytes)
19/11/19 04:30:59 INFO Executor: Running task 1.0 in stage 132.0 (TID 160)
19/11/19 04:30:59 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 159) in 930 ms on localhost (executor driver) (1/2)
19/11/19 04:30:59 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestZbbsUu/job_7933c10b-c7ac-4845-8956-5cc4e706b53e/MANIFEST
19/11/19 04:30:59 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestZbbsUu/job_7933c10b-c7ac-4845-8956-5cc4e706b53e/MANIFEST -> 0 artifacts
19/11/19 04:31:00 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/19 04:31:00 INFO main: Logging handler created.
19/11/19 04:31:00 INFO start: Status HTTP server running at localhost:43773
19/11/19 04:31:00 INFO main: semi_persistent_directory: /tmp
19/11/19 04:31:00 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/19 04:31:00 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574137855.93_88d536d1-7ffe-425e-9997-90bdf6c3ebae', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/19 04:31:00 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574137855.93', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37675'}
19/11/19 04:31:00 INFO __init__: Creating state cache with size 0
19/11/19 04:31:00 INFO __init__: Creating insecure control channel for localhost:41965.
19/11/19 04:31:00 INFO __init__: Control channel established.
19/11/19 04:31:00 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/19 04:31:00 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/19 04:31:00 INFO create_state_handler: Creating insecure state channel for localhost:39167.
19/11/19 04:31:00 INFO create_state_handler: State channel established.
19/11/19 04:31:00 INFO create_data_channel: Creating client data channel for localhost:46853
19/11/19 04:31:00 INFO GrpcDataService: Beam Fn Data client connected.
19/11/19 04:31:00 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/19 04:31:00 INFO run: No more requests from control plane
19/11/19 04:31:00 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/19 04:31:00 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 04:31:00 INFO close: Closing all cached grpc data channels.
19/11/19 04:31:00 INFO close: Closing all cached gRPC state handlers.
19/11/19 04:31:00 INFO run: Done consuming work.
19/11/19 04:31:00 INFO main: Python sdk harness exiting.
19/11/19 04:31:00 INFO GrpcLoggingService: Logging client hanged up.
19/11/19 04:31:00 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 04:31:00 INFO Executor: Finished task 1.0 in stage 132.0 (TID 160). 13753 bytes result sent to driver
19/11/19 04:31:00 INFO TaskSetManager: Finished task 1.0 in stage 132.0 (TID 160) in 923 ms on localhost (executor driver) (2/2)
19/11/19 04:31:00 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/19 04:31:00 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 1.860 s
19/11/19 04:31:00 INFO DAGScheduler: looking for newly runnable stages
19/11/19 04:31:00 INFO DAGScheduler: running: Set()
19/11/19 04:31:00 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/19 04:31:00 INFO DAGScheduler: failed: Set()
19/11/19 04:31:00 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/19 04:31:00 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.5 GB)
19/11/19 04:31:00 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.3 KB, free 13.5 GB)
19/11/19 04:31:00 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:44289 (size: 12.3 KB, free: 13.5 GB)
19/11/19 04:31:00 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/19 04:31:00 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/19 04:31:00 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/19 04:31:00 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/19 04:31:00 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/19 04:31:00 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/19 04:31:00 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/19 04:31:00 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestZbbsUu/job_7933c10b-c7ac-4845-8956-5cc4e706b53e/MANIFEST
19/11/19 04:31:00 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestZbbsUu/job_7933c10b-c7ac-4845-8956-5cc4e706b53e/MANIFEST -> 0 artifacts
19/11/19 04:31:01 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/19 04:31:01 INFO main: Logging handler created.
19/11/19 04:31:01 INFO start: Status HTTP server running at localhost:40351
19/11/19 04:31:01 INFO main: semi_persistent_directory: /tmp
19/11/19 04:31:01 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/19 04:31:01 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574137855.93_88d536d1-7ffe-425e-9997-90bdf6c3ebae', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/19 04:31:01 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574137855.93', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37675'}
19/11/19 04:31:01 INFO __init__: Creating state cache with size 0
19/11/19 04:31:01 INFO __init__: Creating insecure control channel for localhost:38725.
19/11/19 04:31:01 INFO __init__: Control channel established.
19/11/19 04:31:01 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/19 04:31:01 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/19 04:31:01 INFO create_state_handler: Creating insecure state channel for localhost:42257.
19/11/19 04:31:01 INFO create_state_handler: State channel established.
19/11/19 04:31:01 INFO create_data_channel: Creating client data channel for localhost:41275
19/11/19 04:31:01 INFO GrpcDataService: Beam Fn Data client connected.
19/11/19 04:31:01 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/19 04:31:01 INFO run: No more requests from control plane
19/11/19 04:31:01 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/19 04:31:01 INFO close: Closing all cached grpc data channels.
19/11/19 04:31:01 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 04:31:01 INFO close: Closing all cached gRPC state handlers.
19/11/19 04:31:01 INFO run: Done consuming work.
19/11/19 04:31:01 INFO main: Python sdk harness exiting.
19/11/19 04:31:01 INFO GrpcLoggingService: Logging client hanged up.
19/11/19 04:31:01 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 04:31:01 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 12013 bytes result sent to driver
19/11/19 04:31:01 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 857 ms on localhost (executor driver) (1/1)
19/11/19 04:31:01 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/19 04:31:01 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 0.863 s
19/11/19 04:31:01 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 4.632269 s
19/11/19 04:31:01 INFO SparkPipelineRunner: Job test_windowing_1574137855.93_88d536d1-7ffe-425e-9997-90bdf6c3ebae finished.
19/11/19 04:31:01 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/19 04:31:01 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktestZbbsUu/job_7933c10b-c7ac-4845-8956-5cc4e706b53e/MANIFEST has 0 artifact locations
19/11/19 04:31:01 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestZbbsUu/job_7933c10b-c7ac-4845-8956-5cc4e706b53e/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 421, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140106343380736)>
# Thread: <Thread(Thread-119, started daemon 140105993942784)>
# Thread: <_MainThread(MainThread, started 140107122611968)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140105968764672)>
# Thread: <Thread(Thread-125, started daemon 140105977157376)>
# Thread: <_MainThread(MainThread, started 140107122611968)>
# Thread: <Thread(Thread-119, started daemon 140105993942784)>
# Thread: <Thread(wait_until_finish_read, started daemon 140106343380736)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 421, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 431, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574137846.54_7042fa21-8681-4ee4-be55-b4c4e088deb0 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 304.518s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 42s
59 actionable tasks: 47 executed, 12 from cache

Publishing build scan...
https://gradle.com/s/yein2ayvtcx4k

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1566

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1566/display/redirect>

Changes:


------------------------------------------
[...truncated 1.64 MB...]
19/11/19 00:34:45 INFO main: Logging handler created.
19/11/19 00:34:45 INFO start: Status HTTP server running at localhost:45549
19/11/19 00:34:45 INFO main: semi_persistent_directory: /tmp
19/11/19 00:34:45 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/19 00:34:45 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574123682.46_9ba8105d-a79a-41d8-b843-3625e73f823a', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/19 00:34:45 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574123682.46', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42533'}
19/11/19 00:34:45 INFO __init__: Creating state cache with size 0
19/11/19 00:34:45 INFO __init__: Creating insecure control channel for localhost:43951.
19/11/19 00:34:45 INFO __init__: Control channel established.
19/11/19 00:34:45 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/19 00:34:45 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/19 00:34:45 INFO create_state_handler: Creating insecure state channel for localhost:34333.
19/11/19 00:34:45 INFO create_state_handler: State channel established.
19/11/19 00:34:45 INFO create_data_channel: Creating client data channel for localhost:39617
19/11/19 00:34:45 INFO GrpcDataService: Beam Fn Data client connected.
19/11/19 00:34:45 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/19 00:34:45 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/19 00:34:45 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/19 00:34:45 INFO run: No more requests from control plane
19/11/19 00:34:45 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/19 00:34:45 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 00:34:45 INFO close: Closing all cached grpc data channels.
19/11/19 00:34:45 INFO close: Closing all cached gRPC state handlers.
19/11/19 00:34:45 INFO run: Done consuming work.
19/11/19 00:34:45 INFO main: Python sdk harness exiting.
19/11/19 00:34:45 INFO GrpcLoggingService: Logging client hanged up.
19/11/19 00:34:45 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 00:34:45 INFO Executor: Finished task 0.0 in stage 131.0 (TID 158). 12763 bytes result sent to driver
19/11/19 00:34:45 INFO TaskSetManager: Finished task 0.0 in stage 131.0 (TID 158) in 920 ms on localhost (executor driver) (1/1)
19/11/19 00:34:45 INFO TaskSchedulerImpl: Removed TaskSet 131.0, whose tasks have all completed, from pool 
19/11/19 00:34:45 INFO DAGScheduler: ShuffleMapStage 131 (mapToPair at GroupCombineFunctions.java:55) finished in 0.927 s
19/11/19 00:34:45 INFO DAGScheduler: looking for newly runnable stages
19/11/19 00:34:45 INFO DAGScheduler: running: Set()
19/11/19 00:34:45 INFO DAGScheduler: waiting: Set(ShuffleMapStage 132, ResultStage 133)
19/11/19 00:34:45 INFO DAGScheduler: failed: Set()
19/11/19 00:34:45 INFO DAGScheduler: Submitting ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115), which has no missing parents
19/11/19 00:34:45 INFO MemoryStore: Block broadcast_129 stored as values in memory (estimated size 57.2 KB, free 13.5 GB)
19/11/19 00:34:45 INFO MemoryStore: Block broadcast_129_piece0 stored as bytes in memory (estimated size 22.0 KB, free 13.5 GB)
19/11/19 00:34:45 INFO BlockManagerInfo: Added broadcast_129_piece0 in memory on localhost:43689 (size: 22.0 KB, free: 13.5 GB)
19/11/19 00:34:45 INFO SparkContext: Created broadcast 129 from broadcast at DAGScheduler.scala:1161
19/11/19 00:34:45 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115) (first 15 tasks are for partitions Vector(0, 1))
19/11/19 00:34:45 INFO TaskSchedulerImpl: Adding task set 132.0 with 2 tasks
19/11/19 00:34:45 INFO TaskSetManager: Starting task 0.0 in stage 132.0 (TID 159, localhost, executor driver, partition 0, NODE_LOCAL, 7760 bytes)
19/11/19 00:34:45 INFO Executor: Running task 0.0 in stage 132.0 (TID 159)
19/11/19 00:34:45 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/19 00:34:45 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 1 ms
19/11/19 00:34:45 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest1SsyWR/job_b4bf8464-309d-4975-9f99-6db681fe8700/MANIFEST
19/11/19 00:34:45 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest1SsyWR/job_b4bf8464-309d-4975-9f99-6db681fe8700/MANIFEST -> 0 artifacts
19/11/19 00:34:45 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/19 00:34:45 INFO main: Logging handler created.
19/11/19 00:34:45 INFO start: Status HTTP server running at localhost:42879
19/11/19 00:34:45 INFO main: semi_persistent_directory: /tmp
19/11/19 00:34:45 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/19 00:34:45 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574123682.46_9ba8105d-a79a-41d8-b843-3625e73f823a', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/19 00:34:45 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574123682.46', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42533'}
19/11/19 00:34:45 INFO __init__: Creating state cache with size 0
19/11/19 00:34:45 INFO __init__: Creating insecure control channel for localhost:35913.
19/11/19 00:34:45 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/19 00:34:45 INFO __init__: Control channel established.
19/11/19 00:34:45 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/19 00:34:45 INFO create_state_handler: Creating insecure state channel for localhost:43001.
19/11/19 00:34:45 INFO create_state_handler: State channel established.
19/11/19 00:34:46 INFO create_data_channel: Creating client data channel for localhost:35471
19/11/19 00:34:46 INFO GrpcDataService: Beam Fn Data client connected.
19/11/19 00:34:46 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/19 00:34:46 INFO run: No more requests from control plane
19/11/19 00:34:46 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/19 00:34:46 INFO close: Closing all cached grpc data channels.
19/11/19 00:34:46 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 00:34:46 INFO close: Closing all cached gRPC state handlers.
19/11/19 00:34:46 INFO run: Done consuming work.
19/11/19 00:34:46 INFO main: Python sdk harness exiting.
19/11/19 00:34:46 INFO GrpcLoggingService: Logging client hanged up.
19/11/19 00:34:46 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 00:34:46 INFO Executor: Finished task 0.0 in stage 132.0 (TID 159). 15229 bytes result sent to driver
19/11/19 00:34:46 INFO TaskSetManager: Starting task 1.0 in stage 132.0 (TID 160, localhost, executor driver, partition 1, PROCESS_LOCAL, 7977 bytes)
19/11/19 00:34:46 INFO Executor: Running task 1.0 in stage 132.0 (TID 160)
19/11/19 00:34:46 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 159) in 933 ms on localhost (executor driver) (1/2)
19/11/19 00:34:46 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest1SsyWR/job_b4bf8464-309d-4975-9f99-6db681fe8700/MANIFEST
19/11/19 00:34:46 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest1SsyWR/job_b4bf8464-309d-4975-9f99-6db681fe8700/MANIFEST -> 0 artifacts
19/11/19 00:34:46 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/19 00:34:46 INFO main: Logging handler created.
19/11/19 00:34:46 INFO start: Status HTTP server running at localhost:37201
19/11/19 00:34:46 INFO main: semi_persistent_directory: /tmp
19/11/19 00:34:46 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/19 00:34:46 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574123682.46_9ba8105d-a79a-41d8-b843-3625e73f823a', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/19 00:34:46 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574123682.46', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42533'}
19/11/19 00:34:46 INFO __init__: Creating state cache with size 0
19/11/19 00:34:46 INFO __init__: Creating insecure control channel for localhost:35583.
19/11/19 00:34:46 INFO __init__: Control channel established.
19/11/19 00:34:46 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/19 00:34:46 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/19 00:34:46 INFO create_state_handler: Creating insecure state channel for localhost:44221.
19/11/19 00:34:46 INFO create_state_handler: State channel established.
19/11/19 00:34:46 INFO create_data_channel: Creating client data channel for localhost:36207
19/11/19 00:34:46 INFO GrpcDataService: Beam Fn Data client connected.
19/11/19 00:34:46 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/19 00:34:46 INFO run: No more requests from control plane
19/11/19 00:34:46 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/19 00:34:46 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 00:34:46 INFO close: Closing all cached grpc data channels.
19/11/19 00:34:46 INFO close: Closing all cached gRPC state handlers.
19/11/19 00:34:46 INFO run: Done consuming work.
19/11/19 00:34:46 INFO main: Python sdk harness exiting.
19/11/19 00:34:46 INFO GrpcLoggingService: Logging client hanged up.
19/11/19 00:34:47 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 00:34:47 INFO Executor: Finished task 1.0 in stage 132.0 (TID 160). 13710 bytes result sent to driver
19/11/19 00:34:47 INFO TaskSetManager: Finished task 1.0 in stage 132.0 (TID 160) in 869 ms on localhost (executor driver) (2/2)
19/11/19 00:34:47 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/19 00:34:47 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 1.807 s
19/11/19 00:34:47 INFO DAGScheduler: looking for newly runnable stages
19/11/19 00:34:47 INFO DAGScheduler: running: Set()
19/11/19 00:34:47 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/19 00:34:47 INFO DAGScheduler: failed: Set()
19/11/19 00:34:47 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/19 00:34:47 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.5 GB)
19/11/19 00:34:47 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.3 KB, free 13.5 GB)
19/11/19 00:34:47 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:43689 (size: 12.3 KB, free: 13.5 GB)
19/11/19 00:34:47 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/19 00:34:47 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/19 00:34:47 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/19 00:34:47 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/19 00:34:47 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/19 00:34:47 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/19 00:34:47 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 1 ms
19/11/19 00:34:47 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest1SsyWR/job_b4bf8464-309d-4975-9f99-6db681fe8700/MANIFEST
19/11/19 00:34:47 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest1SsyWR/job_b4bf8464-309d-4975-9f99-6db681fe8700/MANIFEST -> 0 artifacts
19/11/19 00:34:47 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/19 00:34:47 INFO main: Logging handler created.
19/11/19 00:34:47 INFO start: Status HTTP server running at localhost:40103
19/11/19 00:34:47 INFO main: semi_persistent_directory: /tmp
19/11/19 00:34:47 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/19 00:34:47 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574123682.46_9ba8105d-a79a-41d8-b843-3625e73f823a', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/19 00:34:47 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574123682.46', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42533'}
19/11/19 00:34:47 INFO __init__: Creating state cache with size 0
19/11/19 00:34:47 INFO __init__: Creating insecure control channel for localhost:43527.
19/11/19 00:34:47 INFO __init__: Control channel established.
19/11/19 00:34:47 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/19 00:34:47 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/19 00:34:47 INFO create_state_handler: Creating insecure state channel for localhost:33051.
19/11/19 00:34:47 INFO create_state_handler: State channel established.
19/11/19 00:34:47 INFO create_data_channel: Creating client data channel for localhost:38419
19/11/19 00:34:47 INFO GrpcDataService: Beam Fn Data client connected.
19/11/19 00:34:47 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/19 00:34:47 INFO run: No more requests from control plane
19/11/19 00:34:47 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/19 00:34:47 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 00:34:47 INFO close: Closing all cached grpc data channels.
19/11/19 00:34:47 INFO close: Closing all cached gRPC state handlers.
19/11/19 00:34:47 INFO run: Done consuming work.
19/11/19 00:34:47 INFO main: Python sdk harness exiting.
19/11/19 00:34:47 INFO GrpcLoggingService: Logging client hanged up.
19/11/19 00:34:47 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/19 00:34:47 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 12013 bytes result sent to driver
19/11/19 00:34:47 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 892 ms on localhost (executor driver) (1/1)
19/11/19 00:34:47 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/19 00:34:47 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 0.898 s
19/11/19 00:34:47 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 4.489692 s
19/11/19 00:34:47 INFO SparkPipelineRunner: Job test_windowing_1574123682.46_9ba8105d-a79a-41d8-b843-3625e73f823a finished.
19/11/19 00:34:47 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/19 00:34:47 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktest1SsyWR/job_b4bf8464-309d-4975-9f99-6db681fe8700/MANIFEST has 0 artifact locations
19/11/19 00:34:47 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktest1SsyWR/job_b4bf8464-309d-4975-9f99-6db681fe8700/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 421, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139879317444352)>
# Thread: <Thread(Thread-119, started daemon 139879334229760)>
# Thread: <_MainThread(MainThread, started 139880113669888)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 431, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574123673.24_0a75eac1-1a1c-49a1-af26-9f57844291d3 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 296.400s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 41s
59 actionable tasks: 46 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/zlbfyglqv7bdk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1565

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1565/display/redirect>

Changes:


------------------------------------------
[...truncated 1.67 MB...]
19/11/18 18:17:47 INFO DAGScheduler: waiting: Set(ShuffleMapStage 132, ResultStage 133)
19/11/18 18:17:47 INFO DAGScheduler: failed: Set()
19/11/18 18:17:47 INFO DAGScheduler: Submitting ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115), which has no missing parents
19/11/18 18:17:47 INFO MemoryStore: Block broadcast_129 stored as values in memory (estimated size 57.2 KB, free 13.5 GB)
19/11/18 18:17:47 INFO MemoryStore: Block broadcast_129_piece0 stored as bytes in memory (estimated size 22.1 KB, free 13.5 GB)
19/11/18 18:17:47 INFO BlockManagerInfo: Added broadcast_129_piece0 in memory on localhost:39939 (size: 22.1 KB, free: 13.5 GB)
19/11/18 18:17:47 INFO SparkContext: Created broadcast 129 from broadcast at DAGScheduler.scala:1161
19/11/18 18:17:47 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115) (first 15 tasks are for partitions Vector(0, 1))
19/11/18 18:17:47 INFO TaskSchedulerImpl: Adding task set 132.0 with 2 tasks
19/11/18 18:17:47 INFO TaskSetManager: Starting task 0.0 in stage 132.0 (TID 159, localhost, executor driver, partition 0, NODE_LOCAL, 7760 bytes)
19/11/18 18:17:47 INFO Executor: Running task 0.0 in stage 132.0 (TID 159)
19/11/18 18:17:47 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/18 18:17:47 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/18 18:17:47 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestoQbvKB/job_e3d80fdf-b904-4837-9dd0-6b5ae8f911bd/MANIFEST
19/11/18 18:17:47 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestoQbvKB/job_e3d80fdf-b904-4837-9dd0-6b5ae8f911bd/MANIFEST -> 0 artifacts
19/11/18 18:17:47 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/18 18:17:47 INFO main: Logging handler created.
19/11/18 18:17:47 INFO start: Status HTTP server running at localhost:42605
19/11/18 18:17:47 INFO main: semi_persistent_directory: /tmp
19/11/18 18:17:47 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/18 18:17:47 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574101064.71_7017dd03-030d-4929-a7c2-ab3520d966f9', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/18 18:17:47 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574101064.71', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:57543'}
19/11/18 18:17:47 INFO __init__: Creating state cache with size 0
19/11/18 18:17:47 INFO __init__: Creating insecure control channel for localhost:33733.
19/11/18 18:17:47 INFO __init__: Control channel established.
19/11/18 18:17:47 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/18 18:17:47 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/18 18:17:47 INFO create_state_handler: Creating insecure state channel for localhost:38557.
19/11/18 18:17:47 INFO create_state_handler: State channel established.
19/11/18 18:17:47 INFO create_data_channel: Creating client data channel for localhost:38973
19/11/18 18:17:47 INFO GrpcDataService: Beam Fn Data client connected.
19/11/18 18:17:48 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/18 18:17:48 INFO run: No more requests from control plane
19/11/18 18:17:48 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/18 18:17:48 INFO close: Closing all cached grpc data channels.
19/11/18 18:17:48 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/18 18:17:48 INFO close: Closing all cached gRPC state handlers.
19/11/18 18:17:48 INFO run: Done consuming work.
19/11/18 18:17:48 INFO main: Python sdk harness exiting.
19/11/18 18:17:48 INFO GrpcLoggingService: Logging client hanged up.
19/11/18 18:17:48 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/18 18:17:48 INFO Executor: Finished task 0.0 in stage 132.0 (TID 159). 15229 bytes result sent to driver
19/11/18 18:17:48 INFO TaskSetManager: Starting task 1.0 in stage 132.0 (TID 160, localhost, executor driver, partition 1, PROCESS_LOCAL, 7977 bytes)
19/11/18 18:17:48 INFO Executor: Running task 1.0 in stage 132.0 (TID 160)
19/11/18 18:17:48 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 159) in 840 ms on localhost (executor driver) (1/2)
19/11/18 18:17:48 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestoQbvKB/job_e3d80fdf-b904-4837-9dd0-6b5ae8f911bd/MANIFEST
19/11/18 18:17:48 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestoQbvKB/job_e3d80fdf-b904-4837-9dd0-6b5ae8f911bd/MANIFEST -> 0 artifacts
19/11/18 18:17:48 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/18 18:17:48 INFO main: Logging handler created.
19/11/18 18:17:48 INFO start: Status HTTP server running at localhost:36053
19/11/18 18:17:48 INFO main: semi_persistent_directory: /tmp
19/11/18 18:17:48 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/18 18:17:48 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574101064.71_7017dd03-030d-4929-a7c2-ab3520d966f9', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/18 18:17:48 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574101064.71', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:57543'}
19/11/18 18:17:48 INFO __init__: Creating state cache with size 0
19/11/18 18:17:48 INFO __init__: Creating insecure control channel for localhost:44255.
19/11/18 18:17:48 INFO __init__: Control channel established.
19/11/18 18:17:48 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/18 18:17:48 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/18 18:17:48 INFO create_state_handler: Creating insecure state channel for localhost:35773.
19/11/18 18:17:48 INFO create_state_handler: State channel established.
19/11/18 18:17:48 INFO create_data_channel: Creating client data channel for localhost:33603
19/11/18 18:17:48 INFO GrpcDataService: Beam Fn Data client connected.
19/11/18 18:17:48 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/18 18:17:48 INFO run: No more requests from control plane
19/11/18 18:17:48 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/18 18:17:48 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/18 18:17:48 INFO close: Closing all cached grpc data channels.
19/11/18 18:17:48 INFO close: Closing all cached gRPC state handlers.
19/11/18 18:17:48 INFO run: Done consuming work.
19/11/18 18:17:48 INFO main: Python sdk harness exiting.
19/11/18 18:17:48 INFO GrpcLoggingService: Logging client hanged up.
19/11/18 18:17:48 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/18 18:17:48 INFO Executor: Finished task 1.0 in stage 132.0 (TID 160). 13710 bytes result sent to driver
19/11/18 18:17:48 INFO TaskSetManager: Finished task 1.0 in stage 132.0 (TID 160) in 769 ms on localhost (executor driver) (2/2)
19/11/18 18:17:48 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/18 18:17:48 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 1.613 s
19/11/18 18:17:48 INFO DAGScheduler: looking for newly runnable stages
19/11/18 18:17:48 INFO DAGScheduler: running: Set()
19/11/18 18:17:48 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/18 18:17:48 INFO DAGScheduler: failed: Set()
19/11/18 18:17:48 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/18 18:17:48 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.5 GB)
19/11/18 18:17:48 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.4 KB, free 13.5 GB)
19/11/18 18:17:48 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:39939 (size: 12.4 KB, free: 13.5 GB)
19/11/18 18:17:48 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/18 18:17:48 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/18 18:17:48 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/18 18:17:48 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/18 18:17:48 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/18 18:17:48 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/18 18:17:48 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/18 18:17:48 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestoQbvKB/job_e3d80fdf-b904-4837-9dd0-6b5ae8f911bd/MANIFEST
19/11/18 18:17:48 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestoQbvKB/job_e3d80fdf-b904-4837-9dd0-6b5ae8f911bd/MANIFEST -> 0 artifacts
19/11/18 18:17:49 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/18 18:17:49 INFO main: Logging handler created.
19/11/18 18:17:49 INFO start: Status HTTP server running at localhost:43633
19/11/18 18:17:49 INFO main: semi_persistent_directory: /tmp
19/11/18 18:17:49 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/18 18:17:49 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574101064.71_7017dd03-030d-4929-a7c2-ab3520d966f9', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/18 18:17:49 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574101064.71', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:57543'}
19/11/18 18:17:49 INFO __init__: Creating state cache with size 0
19/11/18 18:17:49 INFO __init__: Creating insecure control channel for localhost:40739.
19/11/18 18:17:49 INFO __init__: Control channel established.
19/11/18 18:17:49 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/18 18:17:49 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/18 18:17:49 INFO create_state_handler: Creating insecure state channel for localhost:38645.
19/11/18 18:17:49 INFO create_state_handler: State channel established.
19/11/18 18:17:49 INFO create_data_channel: Creating client data channel for localhost:37815
19/11/18 18:17:49 INFO GrpcDataService: Beam Fn Data client connected.
19/11/18 18:17:49 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/18 18:17:49 INFO run: No more requests from control plane
19/11/18 18:17:49 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/18 18:17:49 INFO close: Closing all cached grpc data channels.
19/11/18 18:17:49 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/18 18:17:49 INFO close: Closing all cached gRPC state handlers.
19/11/18 18:17:49 INFO run: Done consuming work.
19/11/18 18:17:49 INFO main: Python sdk harness exiting.
19/11/18 18:17:49 INFO GrpcLoggingService: Logging client hanged up.
19/11/18 18:17:49 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/18 18:17:49 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 11970 bytes result sent to driver
19/11/18 18:17:49 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 789 ms on localhost (executor driver) (1/1)
19/11/18 18:17:49 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/18 18:17:49 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 0.795 s
19/11/18 18:17:49 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 4.015123 s
19/11/18 18:17:49 INFO SparkPipelineRunner: Job test_windowing_1574101064.71_7017dd03-030d-4929-a7c2-ab3520d966f9 finished.
19/11/18 18:17:49 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/18 18:17:49 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktestoQbvKB/job_e3d80fdf-b904-4837-9dd0-6b5ae8f911bd/MANIFEST has 0 artifact locations
19/11/18 18:17:49 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestoQbvKB/job_e3d80fdf-b904-4837-9dd0-6b5ae8f911bd/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
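
A note on the SparkPipelineResult$BatchMode warning above: metric queries
against the returned PipelineResult come back empty on the portable Spark
runner because monitoring infos are not collected yet; the job itself
still completes. A hedged sketch of the query path that hits this
(standard PipelineResult API; the `pipeline` object is assumed to be an
already-built apache_beam.Pipeline):

    result = pipeline.run()
    result.wait_until_finish()

    # On the portable Spark runner at this point in time this returns
    # no counters/distributions, per the warning in the log.
    metrics = result.metrics().query()
    print(metrics['counters'])
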
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 421, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
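
The "Timed out" banners and "# Thread:" dumps accompanying the failures
below come from the suite's timeout watchdog (the handler at
portable_runner_test.py line 75 in the traceback above): when a test
exceeds 60 seconds it prints the live threads and raises BaseException so
that broad except clauses cannot swallow it. A rough sketch of that
mechanism, assuming a SIGALRM-based implementation (the names here are
illustrative, not Beam's exact code):

    import signal
    import threading

    TIMEOUT_SECS = 60  # matches the 60-second timeout in these logs

    def handler(signum, frame):
        msg = 'Timed out after %d seconds.' % TIMEOUT_SECS
        print('==================== %s ====================' % msg)
        # Dump live threads, producing lines like "# Thread: <...>".
        for t in threading.enumerate():
            print('# Thread: %s' % t)
        # BaseException, not Exception, so ordinary handlers can't eat it.
        raise BaseException(msg)

    signal.signal(signal.SIGALRM, handler)
    signal.alarm(TIMEOUT_SECS)  # Unix-only; cancel with signal.alarm(0)

Because the handler prints while worker threads are still logging, its
output lands wherever the console stream happens to be, independent of
the unittest report around it.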

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 421, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140375771576064)>

# Thread: <Thread(Thread-120, started daemon 140375754790656)>

# Thread: <_MainThread(MainThread, started 140376888485632)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 431, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574101056.13_c60635d1-3943-4bcb-bbc0-e75cd2af6673 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140375738005248)>

# Thread: <Thread(Thread-126, started daemon 140375746397952)>

# Thread: <Thread(Thread-120, started daemon 140375754790656)>

# Thread: <Thread(wait_until_finish_read, started daemon 140375771576064)>

# Thread: <_MainThread(MainThread, started 140376888485632)>

----------------------------------------------------------------------
Ran 38 tests in 304.932s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 46s
59 actionable tasks: 47 executed, 12 from cache

Publishing build scan...
https://gradle.com/s/vldqzhea6rhxg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1564

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1564/display/redirect>

Changes:


------------------------------------------
[...truncated 1.62 MB...]
19/11/18 12:15:10 INFO main: Logging handler created.
19/11/18 12:15:10 INFO start: Status HTTP server running at localhost:38353
19/11/18 12:15:10 INFO main: semi_persistent_directory: /tmp
19/11/18 12:15:10 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/18 12:15:10 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574079307.92_744a01d1-3510-4495-9e54-761e34eff40d', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/18 12:15:10 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574079307.92', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50307'}
19/11/18 12:15:10 INFO __init__: Creating state cache with size 0
19/11/18 12:15:10 INFO __init__: Creating insecure control channel for localhost:33197.
19/11/18 12:15:10 INFO __init__: Control channel established.
19/11/18 12:15:10 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/18 12:15:10 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/18 12:15:10 INFO create_state_handler: Creating insecure state channel for localhost:40529.
19/11/18 12:15:10 INFO create_state_handler: State channel established.
19/11/18 12:15:10 INFO create_data_channel: Creating client data channel for localhost:33339
19/11/18 12:15:10 INFO GrpcDataService: Beam Fn Data client connected.
19/11/18 12:15:10 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/18 12:15:10 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 1 ms
19/11/18 12:15:10 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/18 12:15:10 INFO run: No more requests from control plane
19/11/18 12:15:10 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/18 12:15:10 INFO close: Closing all cached grpc data channels.
19/11/18 12:15:10 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/18 12:15:10 INFO close: Closing all cached gRPC state handlers.
19/11/18 12:15:10 INFO run: Done consuming work.
19/11/18 12:15:10 INFO main: Python sdk harness exiting.
19/11/18 12:15:10 INFO GrpcLoggingService: Logging client hanged up.
19/11/18 12:15:10 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/18 12:15:10 INFO Executor: Finished task 0.0 in stage 131.0 (TID 158). 12763 bytes result sent to driver
19/11/18 12:15:10 INFO TaskSetManager: Finished task 0.0 in stage 131.0 (TID 158) in 845 ms on localhost (executor driver) (1/1)
19/11/18 12:15:10 INFO TaskSchedulerImpl: Removed TaskSet 131.0, whose tasks have all completed, from pool 
19/11/18 12:15:10 INFO DAGScheduler: ShuffleMapStage 131 (mapToPair at GroupCombineFunctions.java:55) finished in 0.850 s
19/11/18 12:15:10 INFO DAGScheduler: looking for newly runnable stages
19/11/18 12:15:10 INFO DAGScheduler: running: Set()
19/11/18 12:15:10 INFO DAGScheduler: waiting: Set(ShuffleMapStage 132, ResultStage 133)
19/11/18 12:15:10 INFO DAGScheduler: failed: Set()
19/11/18 12:15:10 INFO DAGScheduler: Submitting ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115), which has no missing parents
19/11/18 12:15:10 INFO MemoryStore: Block broadcast_129 stored as values in memory (estimated size 57.2 KB, free 13.5 GB)
19/11/18 12:15:10 INFO MemoryStore: Block broadcast_129_piece0 stored as bytes in memory (estimated size 22.0 KB, free 13.5 GB)
19/11/18 12:15:10 INFO BlockManagerInfo: Added broadcast_129_piece0 in memory on localhost:37667 (size: 22.0 KB, free: 13.5 GB)
19/11/18 12:15:10 INFO SparkContext: Created broadcast 129 from broadcast at DAGScheduler.scala:1161
19/11/18 12:15:10 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115) (first 15 tasks are for partitions Vector(0, 1))
19/11/18 12:15:10 INFO TaskSchedulerImpl: Adding task set 132.0 with 2 tasks
19/11/18 12:15:10 INFO TaskSetManager: Starting task 0.0 in stage 132.0 (TID 159, localhost, executor driver, partition 0, NODE_LOCAL, 7760 bytes)
19/11/18 12:15:10 INFO Executor: Running task 0.0 in stage 132.0 (TID 159)
19/11/18 12:15:10 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/18 12:15:10 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 1 ms
19/11/18 12:15:10 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest2JbRnn/job_89465ce6-c5a4-4b4d-8711-3016fe8fea54/MANIFEST
19/11/18 12:15:10 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest2JbRnn/job_89465ce6-c5a4-4b4d-8711-3016fe8fea54/MANIFEST -> 0 artifacts
19/11/18 12:15:11 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/18 12:15:11 INFO main: Logging handler created.
19/11/18 12:15:11 INFO start: Status HTTP server running at localhost:39347
19/11/18 12:15:11 INFO main: semi_persistent_directory: /tmp
19/11/18 12:15:11 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/18 12:15:11 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574079307.92_744a01d1-3510-4495-9e54-761e34eff40d', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/18 12:15:11 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574079307.92', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50307'}
19/11/18 12:15:11 INFO __init__: Creating state cache with size 0
19/11/18 12:15:11 INFO __init__: Creating insecure control channel for localhost:43393.
19/11/18 12:15:11 INFO __init__: Control channel established.
19/11/18 12:15:11 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/18 12:15:11 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/18 12:15:11 INFO create_state_handler: Creating insecure state channel for localhost:45781.
19/11/18 12:15:11 INFO create_state_handler: State channel established.
19/11/18 12:15:11 INFO create_data_channel: Creating client data channel for localhost:39109
19/11/18 12:15:11 INFO GrpcDataService: Beam Fn Data client connected.
19/11/18 12:15:11 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/18 12:15:11 INFO run: No more requests from control plane
19/11/18 12:15:11 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/18 12:15:11 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/18 12:15:11 INFO close: Closing all cached grpc data channels.
19/11/18 12:15:11 INFO close: Closing all cached gRPC state handlers.
19/11/18 12:15:11 INFO run: Done consuming work.
19/11/18 12:15:11 INFO main: Python sdk harness exiting.
19/11/18 12:15:11 INFO GrpcLoggingService: Logging client hanged up.
19/11/18 12:15:11 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/18 12:15:11 INFO Executor: Finished task 0.0 in stage 132.0 (TID 159). 15229 bytes result sent to driver
19/11/18 12:15:11 INFO TaskSetManager: Starting task 1.0 in stage 132.0 (TID 160, localhost, executor driver, partition 1, PROCESS_LOCAL, 7977 bytes)
19/11/18 12:15:11 INFO Executor: Running task 1.0 in stage 132.0 (TID 160)
19/11/18 12:15:11 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 159) in 843 ms on localhost (executor driver) (1/2)
19/11/18 12:15:11 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest2JbRnn/job_89465ce6-c5a4-4b4d-8711-3016fe8fea54/MANIFEST
19/11/18 12:15:11 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest2JbRnn/job_89465ce6-c5a4-4b4d-8711-3016fe8fea54/MANIFEST -> 0 artifacts
19/11/18 12:15:12 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/18 12:15:12 INFO main: Logging handler created.
19/11/18 12:15:12 INFO start: Status HTTP server running at localhost:34165
19/11/18 12:15:12 INFO main: semi_persistent_directory: /tmp
19/11/18 12:15:12 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/18 12:15:12 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574079307.92_744a01d1-3510-4495-9e54-761e34eff40d', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/18 12:15:12 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574079307.92', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50307'}
19/11/18 12:15:12 INFO __init__: Creating state cache with size 0
19/11/18 12:15:12 INFO __init__: Creating insecure control channel for localhost:43285.
19/11/18 12:15:12 INFO __init__: Control channel established.
19/11/18 12:15:12 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/18 12:15:12 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/18 12:15:12 INFO create_state_handler: Creating insecure state channel for localhost:40345.
19/11/18 12:15:12 INFO create_state_handler: State channel established.
19/11/18 12:15:12 INFO create_data_channel: Creating client data channel for localhost:43365
19/11/18 12:15:12 INFO GrpcDataService: Beam Fn Data client connected.
19/11/18 12:15:12 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/18 12:15:12 INFO run: No more requests from control plane
19/11/18 12:15:12 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/18 12:15:12 INFO close: Closing all cached grpc data channels.
19/11/18 12:15:12 INFO close: Closing all cached gRPC state handlers.
19/11/18 12:15:12 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/18 12:15:12 INFO run: Done consuming work.
19/11/18 12:15:12 INFO main: Python sdk harness exiting.
19/11/18 12:15:12 INFO GrpcLoggingService: Logging client hanged up.
19/11/18 12:15:12 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/18 12:15:12 INFO Executor: Finished task 1.0 in stage 132.0 (TID 160). 13710 bytes result sent to driver
19/11/18 12:15:12 INFO TaskSetManager: Finished task 1.0 in stage 132.0 (TID 160) in 798 ms on localhost (executor driver) (2/2)
19/11/18 12:15:12 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/18 12:15:12 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 1.645 s
19/11/18 12:15:12 INFO DAGScheduler: looking for newly runnable stages
19/11/18 12:15:12 INFO DAGScheduler: running: Set()
19/11/18 12:15:12 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/18 12:15:12 INFO DAGScheduler: failed: Set()
19/11/18 12:15:12 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/18 12:15:12 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.5 GB)
19/11/18 12:15:12 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.3 KB, free 13.5 GB)
19/11/18 12:15:12 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:37667 (size: 12.3 KB, free: 13.5 GB)
19/11/18 12:15:12 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/18 12:15:12 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/18 12:15:12 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/18 12:15:12 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/18 12:15:12 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/18 12:15:12 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/18 12:15:12 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/18 12:15:12 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest2JbRnn/job_89465ce6-c5a4-4b4d-8711-3016fe8fea54/MANIFEST
19/11/18 12:15:12 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest2JbRnn/job_89465ce6-c5a4-4b4d-8711-3016fe8fea54/MANIFEST -> 0 artifacts
19/11/18 12:15:12 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/18 12:15:12 INFO main: Logging handler created.
19/11/18 12:15:12 INFO start: Status HTTP server running at localhost:37113
19/11/18 12:15:12 INFO main: semi_persistent_directory: /tmp
19/11/18 12:15:12 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/18 12:15:12 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574079307.92_744a01d1-3510-4495-9e54-761e34eff40d', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/18 12:15:12 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574079307.92', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50307'}
19/11/18 12:15:12 INFO __init__: Creating state cache with size 0
19/11/18 12:15:12 INFO __init__: Creating insecure control channel for localhost:34251.
19/11/18 12:15:12 INFO __init__: Control channel established.
19/11/18 12:15:12 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/18 12:15:12 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/18 12:15:12 INFO create_state_handler: Creating insecure state channel for localhost:44555.
19/11/18 12:15:12 INFO create_state_handler: State channel established.
19/11/18 12:15:12 INFO create_data_channel: Creating client data channel for localhost:37639
19/11/18 12:15:12 INFO GrpcDataService: Beam Fn Data client connected.
19/11/18 12:15:12 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/18 12:15:12 INFO run: No more requests from control plane
19/11/18 12:15:12 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/18 12:15:12 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/18 12:15:12 INFO close: Closing all cached grpc data channels.
19/11/18 12:15:12 INFO close: Closing all cached gRPC state handlers.
19/11/18 12:15:12 INFO run: Done consuming work.
19/11/18 12:15:12 INFO main: Python sdk harness exiting.
19/11/18 12:15:12 INFO GrpcLoggingService: Logging client hanged up.
19/11/18 12:15:12 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/18 12:15:13 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 11970 bytes result sent to driver
19/11/18 12:15:13 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 803 ms on localhost (executor driver) (1/1)
19/11/18 12:15:13 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/18 12:15:13 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 0.809 s
19/11/18 12:15:13 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 4.099075 s
19/11/18 12:15:13 INFO SparkPipelineRunner: Job test_windowing_1574079307.92_744a01d1-3510-4495-9e54-761e34eff40d finished.
19/11/18 12:15:13 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/18 12:15:13 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktest2JbRnn/job_89465ce6-c5a4-4b4d-8711-3016fe8fea54/MANIFEST has 0 artifact locations
19/11/18 12:15:13 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktest2JbRnn/job_89465ce6-c5a4-4b4d-8711-3016fe8fea54/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 421, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 431, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574079299.37_4e4dd257-76f2-4343-b500-ba81a3ab3e67 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 298.053s

FAILED (errors=2, skipped=9)
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139811944359680)>

# Thread: <Thread(Thread-120, started daemon 139812212619008)>

# Thread: <_MainThread(MainThread, started 139812731131648)>
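
For context on the test_sdf_with_watermark_tracking failure above: that
test drives a splittable DoFn, and a splittable DoFn may hand the runner
a residual restriction (a self-checkpoint) mid-bundle; the exception says
the Spark batch runner's ActiveBundle has no handler registered to accept
that checkpoint. A hedged sketch of the kind of SDF involved (not the
actual test code; written against the current Python SDK surface, so
constructor details may differ in older releases):

    import apache_beam as beam
    from apache_beam.io.restriction_trackers import (
        OffsetRange, OffsetRestrictionTracker)
    from apache_beam.transforms.core import RestrictionProvider

    class CharRangeProvider(RestrictionProvider):
        def initial_restriction(self, element):
            return OffsetRange(0, len(element))

        def create_tracker(self, restriction):
            return OffsetRestrictionTracker(restriction)

        def restriction_size(self, element, restriction):
            return restriction.size()

    class EmitCharsFn(beam.DoFn):
        def process(self, element,
                    tracker=beam.DoFn.RestrictionParam(CharRangeProvider())):
            cur = tracker.current_restriction().start
            while tracker.try_claim(cur):
                yield element[cur]
                cur += 1

If the SDK self-checkpoints while this runs (deferring part of the
OffsetRange), the runner must accept the residual; the
UnsupportedOperationException above is that acceptance missing.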

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 45s
59 actionable tasks: 47 executed, 12 from cache

Publishing build scan...
https://gradle.com/s/jookktvmd423c

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1563

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1563/display/redirect?page=changes>

Changes:

[thw] Fix sdk_worker_parallelism pipeline option type, add test


------------------------------------------
[...truncated 1.67 MB...]
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2752
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2552
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2542
19/11/18 10:25:23 INFO BlockManagerInfo: Removed broadcast_115_piece0 on localhost:33429 in memory (size: 9.0 KB, free: 13.5 GB)
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2813
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2812
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2774
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2824
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2550
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2725
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2487
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2500
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2736
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2926
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2910
19/11/18 10:25:23 INFO BlockManagerInfo: Removed broadcast_125_piece0 on localhost:33429 in memory (size: 21.9 KB, free: 13.5 GB)
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2574
19/11/18 10:25:23 INFO BlockManagerInfo: Removed broadcast_117_piece0 on localhost:33429 in memory (size: 14.2 KB, free: 13.5 GB)
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2509
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2529
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2567
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2630
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2671
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2535
19/11/18 10:25:23 INFO ContextCleaner: Cleaned shuffle 59
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2504
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2523
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2771
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2833
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2508
19/11/18 10:25:23 INFO BlockManagerInfo: Removed broadcast_111_piece0 on localhost:33429 in memory (size: 12.4 KB, free: 13.5 GB)
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2489
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2631
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2478
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2906
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2609
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2590
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2839
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2719
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2785
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2730
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2870
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2849
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2634
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2789
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2699
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2484
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2525
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2919
19/11/18 10:25:23 INFO ContextCleaner: Cleaned accumulator 2796
19/11/18 10:25:23 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/18 10:25:23 INFO main: Logging handler created.
19/11/18 10:25:23 INFO start: Status HTTP server running at localhost:35531
19/11/18 10:25:23 INFO main: semi_persistent_directory: /tmp
19/11/18 10:25:23 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/18 10:25:23 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574072720.48_c3c0b973-8aa9-42ae-a42a-4b6d38cb7256', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/18 10:25:23 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574072720.48', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42991'}
19/11/18 10:25:23 INFO __init__: Creating state cache with size 0
19/11/18 10:25:23 INFO __init__: Creating insecure control channel for localhost:40129.
19/11/18 10:25:23 INFO __init__: Control channel established.
19/11/18 10:25:23 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/18 10:25:23 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/18 10:25:23 INFO create_state_handler: Creating insecure state channel for localhost:36899.
19/11/18 10:25:23 INFO create_state_handler: State channel established.
19/11/18 10:25:23 INFO create_data_channel: Creating client data channel for localhost:41267
19/11/18 10:25:23 INFO GrpcDataService: Beam Fn Data client connected.
19/11/18 10:25:23 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/18 10:25:23 INFO run: No more requests from control plane
19/11/18 10:25:23 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/18 10:25:23 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/18 10:25:23 INFO close: Closing all cached grpc data channels.
19/11/18 10:25:23 INFO close: Closing all cached gRPC state handlers.
19/11/18 10:25:23 INFO run: Done consuming work.
19/11/18 10:25:23 INFO main: Python sdk harness exiting.
19/11/18 10:25:23 INFO GrpcLoggingService: Logging client hanged up.
19/11/18 10:25:24 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/18 10:25:24 INFO Executor: Finished task 1.0 in stage 132.0 (TID 159). 15272 bytes result sent to driver
19/11/18 10:25:24 INFO TaskSetManager: Starting task 0.0 in stage 132.0 (TID 160, localhost, executor driver, partition 0, PROCESS_LOCAL, 7977 bytes)
19/11/18 10:25:24 INFO Executor: Running task 0.0 in stage 132.0 (TID 160)
19/11/18 10:25:24 INFO TaskSetManager: Finished task 1.0 in stage 132.0 (TID 159) in 910 ms on localhost (executor driver) (1/2)
19/11/18 10:25:24 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestklF8ak/job_8e149195-ab77-41ea-aa15-24cdfa67fd1a/MANIFEST
19/11/18 10:25:24 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestklF8ak/job_8e149195-ab77-41ea-aa15-24cdfa67fd1a/MANIFEST -> 0 artifacts
19/11/18 10:25:24 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/18 10:25:24 INFO main: Logging handler created.
19/11/18 10:25:24 INFO start: Status HTTP server running at localhost:46549
19/11/18 10:25:24 INFO main: semi_persistent_directory: /tmp
19/11/18 10:25:24 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/18 10:25:24 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574072720.48_c3c0b973-8aa9-42ae-a42a-4b6d38cb7256', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/18 10:25:24 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574072720.48', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42991'}
19/11/18 10:25:24 INFO __init__: Creating state cache with size 0
19/11/18 10:25:24 INFO __init__: Creating insecure control channel for localhost:33297.
19/11/18 10:25:24 INFO __init__: Control channel established.
19/11/18 10:25:24 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/18 10:25:24 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/18 10:25:24 INFO create_state_handler: Creating insecure state channel for localhost:44379.
19/11/18 10:25:24 INFO create_state_handler: State channel established.
19/11/18 10:25:24 INFO create_data_channel: Creating client data channel for localhost:40877
19/11/18 10:25:24 INFO GrpcDataService: Beam Fn Data client connected.
19/11/18 10:25:24 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/18 10:25:24 INFO run: No more requests from control plane
19/11/18 10:25:24 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/18 10:25:24 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/18 10:25:24 INFO close: Closing all cached grpc data channels.
19/11/18 10:25:24 INFO close: Closing all cached gRPC state handlers.
19/11/18 10:25:24 INFO run: Done consuming work.
19/11/18 10:25:24 INFO main: Python sdk harness exiting.
19/11/18 10:25:24 INFO GrpcLoggingService: Logging client hanged up.
19/11/18 10:25:24 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/18 10:25:24 INFO Executor: Finished task 0.0 in stage 132.0 (TID 160). 13710 bytes result sent to driver
19/11/18 10:25:24 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 160) in 857 ms on localhost (executor driver) (2/2)
19/11/18 10:25:24 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/18 10:25:24 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 1.771 s
19/11/18 10:25:24 INFO DAGScheduler: looking for newly runnable stages
19/11/18 10:25:24 INFO DAGScheduler: running: Set()
19/11/18 10:25:24 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/18 10:25:24 INFO DAGScheduler: failed: Set()
19/11/18 10:25:24 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/18 10:25:24 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.5 GB)
19/11/18 10:25:24 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.4 KB, free 13.5 GB)
19/11/18 10:25:24 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:33429 (size: 12.4 KB, free: 13.5 GB)
19/11/18 10:25:24 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/18 10:25:24 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/18 10:25:24 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/18 10:25:24 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/18 10:25:24 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/18 10:25:24 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/18 10:25:24 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/18 10:25:24 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestklF8ak/job_8e149195-ab77-41ea-aa15-24cdfa67fd1a/MANIFEST
19/11/18 10:25:24 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestklF8ak/job_8e149195-ab77-41ea-aa15-24cdfa67fd1a/MANIFEST -> 0 artifacts
19/11/18 10:25:25 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/18 10:25:25 INFO main: Logging handler created.
19/11/18 10:25:25 INFO start: Status HTTP server running at localhost:41131
19/11/18 10:25:25 INFO main: semi_persistent_directory: /tmp
19/11/18 10:25:25 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/18 10:25:25 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574072720.48_c3c0b973-8aa9-42ae-a42a-4b6d38cb7256', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/18 10:25:25 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574072720.48', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42991'}
19/11/18 10:25:25 INFO __init__: Creating state cache with size 0
19/11/18 10:25:25 INFO __init__: Creating insecure control channel for localhost:43825.
19/11/18 10:25:25 INFO __init__: Control channel established.
19/11/18 10:25:25 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/18 10:25:25 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/18 10:25:25 INFO create_state_handler: Creating insecure state channel for localhost:34815.
19/11/18 10:25:25 INFO create_state_handler: State channel established.
19/11/18 10:25:25 INFO create_data_channel: Creating client data channel for localhost:44145
19/11/18 10:25:25 INFO GrpcDataService: Beam Fn Data client connected.
19/11/18 10:25:25 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/18 10:25:25 INFO run: No more requests from control plane
19/11/18 10:25:25 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/18 10:25:25 INFO close: Closing all cached grpc data channels.
19/11/18 10:25:25 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/18 10:25:25 INFO close: Closing all cached gRPC state handlers.
19/11/18 10:25:25 INFO run: Done consuming work.
19/11/18 10:25:25 INFO main: Python sdk harness exiting.
19/11/18 10:25:25 INFO GrpcLoggingService: Logging client hanged up.
19/11/18 10:25:25 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/18 10:25:25 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 11970 bytes result sent to driver
19/11/18 10:25:25 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 837 ms on localhost (executor driver) (1/1)
19/11/18 10:25:25 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/18 10:25:25 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 0.842 s
19/11/18 10:25:25 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 4.257861 s
19/11/18 10:25:25 INFO SparkPipelineRunner: Job test_windowing_1574072720.48_c3c0b973-8aa9-42ae-a42a-4b6d38cb7256 finished.
19/11/18 10:25:25 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/18 10:25:25 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktestklF8ak/job_8e149195-ab77-41ea-aa15-24cdfa67fd1a/MANIFEST has 0 artifact locations
19/11/18 10:25:25 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestklF8ak/job_8e149195-ab77-41ea-aa15-24cdfa67fd1a/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 421, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139827014002432)>

# Thread: <Thread(Thread-120, started daemon 139827022395136)>

# Thread: <_MainThread(MainThread, started 139828151654144)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 431, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574072711.38_b1dcae71-87e2-4677-af80-0a0e9b8a6560 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 296.397s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 34s
59 actionable tasks: 53 executed, 6 from cache

Publishing build scan...
https://gradle.com/s/kcke24o3y2oac

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1562

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1562/display/redirect>

Changes:


------------------------------------------
[...truncated 1.65 MB...]
19/11/18 06:12:06 INFO DAGScheduler: waiting: Set(ShuffleMapStage 132, ResultStage 133)
19/11/18 06:12:06 INFO DAGScheduler: failed: Set()
19/11/18 06:12:06 INFO DAGScheduler: Submitting ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115), which has no missing parents
19/11/18 06:12:06 INFO MemoryStore: Block broadcast_129 stored as values in memory (estimated size 57.2 KB, free 13.5 GB)
19/11/18 06:12:06 INFO MemoryStore: Block broadcast_129_piece0 stored as bytes in memory (estimated size 22.0 KB, free 13.5 GB)
19/11/18 06:12:06 INFO BlockManagerInfo: Added broadcast_129_piece0 in memory on localhost:34975 (size: 22.0 KB, free: 13.5 GB)
19/11/18 06:12:06 INFO SparkContext: Created broadcast 129 from broadcast at DAGScheduler.scala:1161
19/11/18 06:12:06 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115) (first 15 tasks are for partitions Vector(0, 1))
19/11/18 06:12:06 INFO TaskSchedulerImpl: Adding task set 132.0 with 2 tasks
19/11/18 06:12:06 INFO TaskSetManager: Starting task 0.0 in stage 132.0 (TID 159, localhost, executor driver, partition 0, NODE_LOCAL, 7760 bytes)
19/11/18 06:12:06 INFO Executor: Running task 0.0 in stage 132.0 (TID 159)
19/11/18 06:12:06 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/18 06:12:06 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/18 06:12:06 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestxy2b3G/job_ee25e0b9-1d3b-41f7-ae36-26a67067701c/MANIFEST
19/11/18 06:12:06 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestxy2b3G/job_ee25e0b9-1d3b-41f7-ae36-26a67067701c/MANIFEST -> 0 artifacts
19/11/18 06:12:07 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/18 06:12:07 INFO main: Logging handler created.
19/11/18 06:12:07 INFO start: Status HTTP server running at localhost:42971
19/11/18 06:12:07 INFO main: semi_persistent_directory: /tmp
19/11/18 06:12:07 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/18 06:12:07 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574057523.26_d4919793-0705-495e-b073-38077e08fe43', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/18 06:12:07 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574057523.26', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52085'}
19/11/18 06:12:07 INFO __init__: Creating state cache with size 0
19/11/18 06:12:07 INFO __init__: Creating insecure control channel for localhost:45409.
19/11/18 06:12:07 INFO __init__: Control channel established.
19/11/18 06:12:07 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/18 06:12:07 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/18 06:12:07 INFO create_state_handler: Creating insecure state channel for localhost:40481.
19/11/18 06:12:07 INFO create_state_handler: State channel established.
19/11/18 06:12:07 INFO create_data_channel: Creating client data channel for localhost:46267
19/11/18 06:12:07 INFO GrpcDataService: Beam Fn Data client connected.
19/11/18 06:12:07 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/18 06:12:07 INFO run: No more requests from control plane
19/11/18 06:12:07 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/18 06:12:07 INFO close: Closing all cached grpc data channels.
19/11/18 06:12:07 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/18 06:12:07 INFO close: Closing all cached gRPC state handlers.
19/11/18 06:12:07 INFO run: Done consuming work.
19/11/18 06:12:07 INFO main: Python sdk harness exiting.
19/11/18 06:12:07 INFO GrpcLoggingService: Logging client hanged up.
19/11/18 06:12:07 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/18 06:12:07 INFO Executor: Finished task 0.0 in stage 132.0 (TID 159). 15229 bytes result sent to driver
19/11/18 06:12:07 INFO TaskSetManager: Starting task 1.0 in stage 132.0 (TID 160, localhost, executor driver, partition 1, PROCESS_LOCAL, 7977 bytes)
19/11/18 06:12:07 INFO Executor: Running task 1.0 in stage 132.0 (TID 160)
19/11/18 06:12:07 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 159) in 991 ms on localhost (executor driver) (1/2)
19/11/18 06:12:07 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestxy2b3G/job_ee25e0b9-1d3b-41f7-ae36-26a67067701c/MANIFEST
19/11/18 06:12:07 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestxy2b3G/job_ee25e0b9-1d3b-41f7-ae36-26a67067701c/MANIFEST -> 0 artifacts
19/11/18 06:12:08 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/18 06:12:08 INFO main: Logging handler created.
19/11/18 06:12:08 INFO start: Status HTTP server running at localhost:43085
19/11/18 06:12:08 INFO main: semi_persistent_directory: /tmp
19/11/18 06:12:08 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/18 06:12:08 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574057523.26_d4919793-0705-495e-b073-38077e08fe43', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/18 06:12:08 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574057523.26', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52085'}
19/11/18 06:12:08 INFO __init__: Creating state cache with size 0
19/11/18 06:12:08 INFO __init__: Creating insecure control channel for localhost:46111.
19/11/18 06:12:08 INFO __init__: Control channel established.
19/11/18 06:12:08 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/18 06:12:08 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/18 06:12:08 INFO create_state_handler: Creating insecure state channel for localhost:44779.
19/11/18 06:12:08 INFO create_state_handler: State channel established.
19/11/18 06:12:08 INFO create_data_channel: Creating client data channel for localhost:36741
19/11/18 06:12:08 INFO GrpcDataService: Beam Fn Data client connected.
19/11/18 06:12:08 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/18 06:12:08 INFO run: No more requests from control plane
19/11/18 06:12:08 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/18 06:12:08 INFO close: Closing all cached grpc data channels.
19/11/18 06:12:08 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/18 06:12:08 INFO close: Closing all cached gRPC state handlers.
19/11/18 06:12:08 INFO run: Done consuming work.
19/11/18 06:12:08 INFO main: Python sdk harness exiting.
19/11/18 06:12:08 INFO GrpcLoggingService: Logging client hanged up.
19/11/18 06:12:08 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/18 06:12:08 INFO Executor: Finished task 1.0 in stage 132.0 (TID 160). 13710 bytes result sent to driver
19/11/18 06:12:08 INFO TaskSetManager: Finished task 1.0 in stage 132.0 (TID 160) in 953 ms on localhost (executor driver) (2/2)
19/11/18 06:12:08 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/18 06:12:08 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 1.951 s
19/11/18 06:12:08 INFO DAGScheduler: looking for newly runnable stages
19/11/18 06:12:08 INFO DAGScheduler: running: Set()
19/11/18 06:12:08 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/18 06:12:08 INFO DAGScheduler: failed: Set()
19/11/18 06:12:08 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/18 06:12:08 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.5 GB)
19/11/18 06:12:08 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.3 KB, free 13.5 GB)
19/11/18 06:12:08 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:34975 (size: 12.3 KB, free: 13.5 GB)
19/11/18 06:12:08 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/18 06:12:08 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/18 06:12:08 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/18 06:12:08 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/18 06:12:08 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/18 06:12:08 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/18 06:12:08 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/18 06:12:08 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestxy2b3G/job_ee25e0b9-1d3b-41f7-ae36-26a67067701c/MANIFEST
19/11/18 06:12:08 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestxy2b3G/job_ee25e0b9-1d3b-41f7-ae36-26a67067701c/MANIFEST -> 0 artifacts
19/11/18 06:12:09 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/18 06:12:09 INFO main: Logging handler created.
19/11/18 06:12:09 INFO start: Status HTTP server running at localhost:33515
19/11/18 06:12:09 INFO main: semi_persistent_directory: /tmp
19/11/18 06:12:09 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/18 06:12:09 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574057523.26_d4919793-0705-495e-b073-38077e08fe43', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/18 06:12:09 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574057523.26', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52085'}
19/11/18 06:12:09 INFO __init__: Creating state cache with size 0
19/11/18 06:12:09 INFO __init__: Creating insecure control channel for localhost:37479.
19/11/18 06:12:09 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/18 06:12:09 INFO __init__: Control channel established.
19/11/18 06:12:09 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/18 06:12:09 INFO create_state_handler: Creating insecure state channel for localhost:36629.
19/11/18 06:12:09 INFO create_state_handler: State channel established.
19/11/18 06:12:09 INFO create_data_channel: Creating client data channel for localhost:42545
19/11/18 06:12:09 INFO GrpcDataService: Beam Fn Data client connected.
19/11/18 06:12:09 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/18 06:12:09 INFO run: No more requests from control plane
19/11/18 06:12:09 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/18 06:12:09 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/18 06:12:09 INFO close: Closing all cached grpc data channels.
19/11/18 06:12:09 INFO close: Closing all cached gRPC state handlers.
19/11/18 06:12:09 INFO run: Done consuming work.
19/11/18 06:12:09 INFO main: Python sdk harness exiting.
19/11/18 06:12:09 INFO GrpcLoggingService: Logging client hanged up.
19/11/18 06:12:09 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/18 06:12:09 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 11970 bytes result sent to driver
19/11/18 06:12:09 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 871 ms on localhost (executor driver) (1/1)
19/11/18 06:12:09 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/18 06:12:09 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 0.878 s
19/11/18 06:12:09 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 4.866223 s
19/11/18 06:12:09 INFO SparkPipelineRunner: Job test_windowing_1574057523.26_d4919793-0705-495e-b073-38077e08fe43 finished.
19/11/18 06:12:09 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/18 06:12:09 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktestxy2b3G/job_ee25e0b9-1d3b-41f7-ae36-26a67067701c/MANIFEST has 0 artifact locations
19/11/18 06:12:09 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestxy2b3G/job_ee25e0b9-1d3b-41f7-ae36-26a67067701c/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
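
The repeated "_load_main_session: No session file found" warnings in the log
above are expected when a pipeline does not stage its __main__ session. A
minimal sketch of opting in via the standard SetupOptions flag (the pipeline
contents are illustrative):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions, SetupOptions

    options = PipelineOptions()
    # Pickle the __main__ namespace and stage it for workers, so functions
    # defined in an interactive session resolve on the SDK harness.
    options.view_as(SetupOptions).save_main_session = True

    with beam.Pipeline(options=options) as p:
        p | beam.Create(['a', 'b']) | beam.Map(lambda x: x)
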
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 421, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
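
Both timeouts in this run fire while wait_until_finish is blocked in
"for state_response in self._state_stream" - a server-streaming gRPC read
that only returns when the job server emits a state change. A minimal
standalone sketch of that read path, assuming the Beam Job API stubs
(the endpoint comes from the log; the job id is a placeholder):

    import grpc
    from apache_beam.portability.api import beam_job_api_pb2, beam_job_api_pb2_grpc

    channel = grpc.insecure_channel('localhost:52085')  # job_endpoint from the log
    stub = beam_job_api_pb2_grpc.JobServiceStub(channel)
    state_stream = stub.GetStateStream(
        beam_job_api_pb2.GetJobStateRequest(job_id='<job id>'))
    # Each next() blocks inside grpc._channel until a message arrives; if the
    # job hangs and never reaches a terminal state, this loop never exits.
    for state_response in state_stream:
        print(state_response.state)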

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 421, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140277897156352)>

# Thread: <Thread(Thread-119, started daemon 140277880370944)>

# Thread: <_MainThread(MainThread, started 140278676596480)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140277255763712)>
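
test_pardo_timers drives event-time timers through the same state channel that
timed out here. A rough sketch of the timer pattern the test exercises, using
the public userstate API (FireAtWatermark and the +10 offset are illustrative,
not the test's actual code):

    import apache_beam as beam
    from apache_beam.transforms import userstate
    from apache_beam.transforms.timeutil import TimeDomain

    class FireAtWatermark(beam.DoFn):
        TIMER = userstate.TimerSpec('fire', TimeDomain.WATERMARK)

        def process(self, kv, ts=beam.DoFn.TimestampParam,
                    timer=beam.DoFn.TimerParam(TIMER)):
            timer.set(ts + 10)  # callback once the watermark passes ts + 10

        @userstate.on_timer(TIMER)
        def fire(self):
            yield 'fired'

    # Timers require a keyed PCollection:
    # p | beam.Create([('k', 0)]) | beam.ParDo(FireAtWatermark())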

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 431, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574057513.48_78ba2913-4efa-4693-9ae6-98d1f3f809a2 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

# Thread: <Thread(Thread-125, started daemon 140277870929664)>

# Thread: <Thread(Thread-119, started daemon 140277880370944)>

# Thread: <_MainThread(MainThread, started 140278676596480)>

----------------------------------------------------------------------
Ran 38 tests in 318.300s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 18s
59 actionable tasks: 46 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/ctek5bt2z6u2g

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1561

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1561/display/redirect>

Changes:


------------------------------------------
[...truncated 1.66 MB...]
19/11/18 00:12:42 INFO main: Logging handler created.
19/11/18 00:12:42 INFO start: Status HTTP server running at localhost:34209
19/11/18 00:12:42 INFO main: semi_persistent_directory: /tmp
19/11/18 00:12:42 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/18 00:12:42 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574035959.65_610e63b0-2c85-4c59-a8f5-bca712383165', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/18 00:12:42 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574035959.65', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53521'}
19/11/18 00:12:42 INFO __init__: Creating state cache with size 0
19/11/18 00:12:42 INFO __init__: Creating insecure control channel for localhost:42727.
19/11/18 00:12:42 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/18 00:12:42 INFO __init__: Control channel established.
19/11/18 00:12:42 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/18 00:12:42 INFO create_state_handler: Creating insecure state channel for localhost:45587.
19/11/18 00:12:42 INFO create_state_handler: State channel established.
19/11/18 00:12:42 INFO create_data_channel: Creating client data channel for localhost:36159
19/11/18 00:12:42 INFO GrpcDataService: Beam Fn Data client connected.
19/11/18 00:12:42 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/18 00:12:42 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 1 ms
19/11/18 00:12:42 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/18 00:12:42 INFO run: No more requests from control plane
19/11/18 00:12:42 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/18 00:12:42 INFO close: Closing all cached grpc data channels.
19/11/18 00:12:42 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/18 00:12:42 INFO close: Closing all cached gRPC state handlers.
19/11/18 00:12:42 INFO run: Done consuming work.
19/11/18 00:12:42 INFO main: Python sdk harness exiting.
19/11/18 00:12:42 INFO GrpcLoggingService: Logging client hanged up.
19/11/18 00:12:42 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/18 00:12:42 INFO Executor: Finished task 0.0 in stage 131.0 (TID 158). 12763 bytes result sent to driver
19/11/18 00:12:42 INFO TaskSetManager: Finished task 0.0 in stage 131.0 (TID 158) in 885 ms on localhost (executor driver) (1/1)
19/11/18 00:12:42 INFO TaskSchedulerImpl: Removed TaskSet 131.0, whose tasks have all completed, from pool 
19/11/18 00:12:42 INFO DAGScheduler: ShuffleMapStage 131 (mapToPair at GroupCombineFunctions.java:55) finished in 0.892 s
19/11/18 00:12:42 INFO DAGScheduler: looking for newly runnable stages
19/11/18 00:12:42 INFO DAGScheduler: running: Set()
19/11/18 00:12:42 INFO DAGScheduler: waiting: Set(ShuffleMapStage 132, ResultStage 133)
19/11/18 00:12:42 INFO DAGScheduler: failed: Set()
19/11/18 00:12:42 INFO DAGScheduler: Submitting ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115), which has no missing parents
19/11/18 00:12:42 INFO MemoryStore: Block broadcast_129 stored as values in memory (estimated size 57.2 KB, free 13.5 GB)
19/11/18 00:12:42 INFO MemoryStore: Block broadcast_129_piece0 stored as bytes in memory (estimated size 22.9 KB, free 13.5 GB)
19/11/18 00:12:42 INFO BlockManagerInfo: Added broadcast_129_piece0 in memory on localhost:46625 (size: 22.9 KB, free: 13.5 GB)
19/11/18 00:12:42 INFO SparkContext: Created broadcast 129 from broadcast at DAGScheduler.scala:1161
19/11/18 00:12:42 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115) (first 15 tasks are for partitions Vector(0, 1))
19/11/18 00:12:42 INFO TaskSchedulerImpl: Adding task set 132.0 with 2 tasks
19/11/18 00:12:42 INFO TaskSetManager: Starting task 1.0 in stage 132.0 (TID 159, localhost, executor driver, partition 1, NODE_LOCAL, 7760 bytes)
19/11/18 00:12:42 INFO Executor: Running task 1.0 in stage 132.0 (TID 159)
19/11/18 00:12:42 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/18 00:12:42 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/18 00:12:42 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestUp_W0c/job_215f270c-b71e-42f2-b233-730ea059a153/MANIFEST
19/11/18 00:12:42 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestUp_W0c/job_215f270c-b71e-42f2-b233-730ea059a153/MANIFEST -> 0 artifacts
19/11/18 00:12:43 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/18 00:12:43 INFO main: Logging handler created.
19/11/18 00:12:43 INFO start: Status HTTP server running at localhost:41387
19/11/18 00:12:43 INFO main: semi_persistent_directory: /tmp
19/11/18 00:12:43 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/18 00:12:43 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574035959.65_610e63b0-2c85-4c59-a8f5-bca712383165', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/18 00:12:43 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574035959.65', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53521'}
19/11/18 00:12:43 INFO __init__: Creating state cache with size 0
19/11/18 00:12:43 INFO __init__: Creating insecure control channel for localhost:43815.
19/11/18 00:12:43 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/18 00:12:43 INFO __init__: Control channel established.
19/11/18 00:12:43 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/18 00:12:43 INFO create_state_handler: Creating insecure state channel for localhost:41097.
19/11/18 00:12:43 INFO create_state_handler: State channel established.
19/11/18 00:12:43 INFO create_data_channel: Creating client data channel for localhost:42249
19/11/18 00:12:43 INFO GrpcDataService: Beam Fn Data client connected.
19/11/18 00:12:43 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/18 00:12:43 INFO run: No more requests from control plane
19/11/18 00:12:43 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/18 00:12:43 INFO close: Closing all cached grpc data channels.
19/11/18 00:12:43 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/18 00:12:43 INFO close: Closing all cached gRPC state handlers.
19/11/18 00:12:43 INFO run: Done consuming work.
19/11/18 00:12:43 INFO main: Python sdk harness exiting.
19/11/18 00:12:43 INFO GrpcLoggingService: Logging client hanged up.
19/11/18 00:12:43 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/18 00:12:43 INFO Executor: Finished task 1.0 in stage 132.0 (TID 159). 15229 bytes result sent to driver
19/11/18 00:12:43 INFO TaskSetManager: Starting task 0.0 in stage 132.0 (TID 160, localhost, executor driver, partition 0, PROCESS_LOCAL, 7977 bytes)
19/11/18 00:12:43 INFO Executor: Running task 0.0 in stage 132.0 (TID 160)
19/11/18 00:12:43 INFO TaskSetManager: Finished task 1.0 in stage 132.0 (TID 159) in 907 ms on localhost (executor driver) (1/2)
19/11/18 00:12:43 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestUp_W0c/job_215f270c-b71e-42f2-b233-730ea059a153/MANIFEST
19/11/18 00:12:43 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestUp_W0c/job_215f270c-b71e-42f2-b233-730ea059a153/MANIFEST -> 0 artifacts
19/11/18 00:12:43 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/18 00:12:43 INFO main: Logging handler created.
19/11/18 00:12:43 INFO start: Status HTTP server running at localhost:34725
19/11/18 00:12:43 INFO main: semi_persistent_directory: /tmp
19/11/18 00:12:43 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/18 00:12:43 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574035959.65_610e63b0-2c85-4c59-a8f5-bca712383165', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/18 00:12:43 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574035959.65', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53521'}
19/11/18 00:12:43 INFO __init__: Creating state cache with size 0
19/11/18 00:12:43 INFO __init__: Creating insecure control channel for localhost:43301.
19/11/18 00:12:43 INFO __init__: Control channel established.
19/11/18 00:12:43 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/18 00:12:43 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/18 00:12:43 INFO create_state_handler: Creating insecure state channel for localhost:44977.
19/11/18 00:12:43 INFO create_state_handler: State channel established.
19/11/18 00:12:43 INFO create_data_channel: Creating client data channel for localhost:40843
19/11/18 00:12:43 INFO GrpcDataService: Beam Fn Data client connected.
19/11/18 00:12:44 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/18 00:12:44 INFO run: No more requests from control plane
19/11/18 00:12:44 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/18 00:12:44 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/18 00:12:44 INFO close: Closing all cached grpc data channels.
19/11/18 00:12:44 INFO close: Closing all cached gRPC state handlers.
19/11/18 00:12:44 INFO run: Done consuming work.
19/11/18 00:12:44 INFO main: Python sdk harness exiting.
19/11/18 00:12:44 INFO GrpcLoggingService: Logging client hanged up.
19/11/18 00:12:44 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/18 00:12:44 INFO Executor: Finished task 0.0 in stage 132.0 (TID 160). 13710 bytes result sent to driver
19/11/18 00:12:44 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 160) in 814 ms on localhost (executor driver) (2/2)
19/11/18 00:12:44 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/18 00:12:44 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 1.726 s
19/11/18 00:12:44 INFO DAGScheduler: looking for newly runnable stages
19/11/18 00:12:44 INFO DAGScheduler: running: Set()
19/11/18 00:12:44 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/18 00:12:44 INFO DAGScheduler: failed: Set()
19/11/18 00:12:44 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/18 00:12:44 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.5 GB)
19/11/18 00:12:44 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.4 KB, free 13.5 GB)
19/11/18 00:12:44 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:46625 (size: 12.4 KB, free: 13.5 GB)
19/11/18 00:12:44 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/18 00:12:44 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/18 00:12:44 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/18 00:12:44 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/18 00:12:44 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/18 00:12:44 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/18 00:12:44 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/18 00:12:44 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestUp_W0c/job_215f270c-b71e-42f2-b233-730ea059a153/MANIFEST
19/11/18 00:12:44 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestUp_W0c/job_215f270c-b71e-42f2-b233-730ea059a153/MANIFEST -> 0 artifacts
19/11/18 00:12:44 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/18 00:12:44 INFO main: Logging handler created.
19/11/18 00:12:44 INFO start: Status HTTP server running at localhost:40879
19/11/18 00:12:44 INFO main: semi_persistent_directory: /tmp
19/11/18 00:12:44 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/18 00:12:44 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574035959.65_610e63b0-2c85-4c59-a8f5-bca712383165', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/18 00:12:44 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574035959.65', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53521'}
19/11/18 00:12:44 INFO __init__: Creating state cache with size 0
19/11/18 00:12:44 INFO __init__: Creating insecure control channel for localhost:41213.
19/11/18 00:12:44 INFO __init__: Control channel established.
19/11/18 00:12:44 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/18 00:12:44 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/18 00:12:44 INFO create_state_handler: Creating insecure state channel for localhost:43425.
19/11/18 00:12:44 INFO create_state_handler: State channel established.
19/11/18 00:12:44 INFO create_data_channel: Creating client data channel for localhost:45243
19/11/18 00:12:44 INFO GrpcDataService: Beam Fn Data client connected.
19/11/18 00:12:44 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/18 00:12:44 INFO run: No more requests from control plane
19/11/18 00:12:44 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/18 00:12:44 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/18 00:12:44 INFO close: Closing all cached grpc data channels.
19/11/18 00:12:44 INFO close: Closing all cached gRPC state handlers.
19/11/18 00:12:44 INFO run: Done consuming work.
19/11/18 00:12:44 INFO main: Python sdk harness exiting.
19/11/18 00:12:44 INFO GrpcLoggingService: Logging client hanged up.
19/11/18 00:12:44 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/18 00:12:44 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 11970 bytes result sent to driver
19/11/18 00:12:44 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 825 ms on localhost (executor driver) (1/1)
19/11/18 00:12:44 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/18 00:12:44 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 0.831 s
19/11/18 00:12:44 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 4.281423 s
19/11/18 00:12:44 INFO SparkPipelineRunner: Job test_windowing_1574035959.65_610e63b0-2c85-4c59-a8f5-bca712383165 finished.
19/11/18 00:12:44 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/18 00:12:44 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktestUp_W0c/job_215f270c-b71e-42f2-b233-730ea059a153/MANIFEST has 0 artifact locations
19/11/18 00:12:44 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestUp_W0c/job_215f270c-b71e-42f2-b233-730ea059a153/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
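
The "Discarding unparseable args" warnings in the log above come from
PipelineOptions: get_all_options() parses only the flags registered by known
option classes and warns about the rest instead of failing. A minimal sketch
(flag names illustrative):

    from apache_beam.options.pipeline_options import PipelineOptions

    opts = PipelineOptions(['--job_name=demo', '--totally_unknown_flag=1'])
    # Known flags are parsed; unknown ones are reported in a
    # "Discarding unparseable args" warning, as seen in the log.
    print(opts.get_all_options(drop_default=True))
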
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 421, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140447873971968)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
# Thread: <Thread(Thread-119, started daemon 140447890757376)>

# Thread: <_MainThread(MainThread, started 140448670197504)>
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 431, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574035950.42_181c083b-faa2-43ad-ac03-c5be5e206d96 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 290.223s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 18s
59 actionable tasks: 46 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/qd2bgioczkfye

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1560

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1560/display/redirect>

Changes:


------------------------------------------
[...truncated 1.64 MB...]
19/11/17 18:12:07 INFO TaskSchedulerImpl: Removed TaskSet 131.0, whose tasks have all completed, from pool 
19/11/17 18:12:07 INFO DAGScheduler: ShuffleMapStage 131 (mapToPair at GroupCombineFunctions.java:55) finished in 0.829 s
19/11/17 18:12:07 INFO DAGScheduler: looking for newly runnable stages
19/11/17 18:12:07 INFO DAGScheduler: running: Set()
19/11/17 18:12:07 INFO DAGScheduler: waiting: Set(ShuffleMapStage 132, ResultStage 133)
19/11/17 18:12:07 INFO DAGScheduler: failed: Set()
19/11/17 18:12:07 INFO DAGScheduler: Submitting ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115), which has no missing parents
19/11/17 18:12:07 INFO MemoryStore: Block broadcast_129 stored as values in memory (estimated size 57.2 KB, free 13.4 GB)
19/11/17 18:12:07 INFO MemoryStore: Block broadcast_129_piece0 stored as bytes in memory (estimated size 22.9 KB, free 13.4 GB)
19/11/17 18:12:07 INFO BlockManagerInfo: Added broadcast_129_piece0 in memory on localhost:39369 (size: 22.9 KB, free: 13.4 GB)
19/11/17 18:12:07 INFO SparkContext: Created broadcast 129 from broadcast at DAGScheduler.scala:1161
19/11/17 18:12:07 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115) (first 15 tasks are for partitions Vector(0, 1))
19/11/17 18:12:07 INFO TaskSchedulerImpl: Adding task set 132.0 with 2 tasks
19/11/17 18:12:07 INFO TaskSetManager: Starting task 1.0 in stage 132.0 (TID 159, localhost, executor driver, partition 1, NODE_LOCAL, 7760 bytes)
19/11/17 18:12:07 INFO Executor: Running task 1.0 in stage 132.0 (TID 159)
19/11/17 18:12:07 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/17 18:12:07 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/17 18:12:07 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestPlE6iO/job_8a74c158-ac8d-4349-a8e2-68f2322ccb0f/MANIFEST
19/11/17 18:12:07 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestPlE6iO/job_8a74c158-ac8d-4349-a8e2-68f2322ccb0f/MANIFEST -> 0 artifacts
19/11/17 18:12:08 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/17 18:12:08 INFO main: Logging handler created.
19/11/17 18:12:08 INFO start: Status HTTP server running at localhost:35163
19/11/17 18:12:08 INFO main: semi_persistent_directory: /tmp
19/11/17 18:12:08 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/17 18:12:08 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574014324.92_28b022f7-4863-4c0b-a02d-b6cba8c7edc5', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/17 18:12:08 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574014324.92', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46105'}
19/11/17 18:12:08 INFO __init__: Creating state cache with size 0
19/11/17 18:12:08 INFO __init__: Creating insecure control channel for localhost:42665.
19/11/17 18:12:08 INFO __init__: Control channel established.
19/11/17 18:12:08 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/17 18:12:08 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/17 18:12:08 INFO create_state_handler: Creating insecure state channel for localhost:43717.
19/11/17 18:12:08 INFO create_state_handler: State channel established.
19/11/17 18:12:08 INFO create_data_channel: Creating client data channel for localhost:36593
19/11/17 18:12:08 INFO GrpcDataService: Beam Fn Data client connected.
19/11/17 18:12:08 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/17 18:12:08 INFO run: No more requests from control plane
19/11/17 18:12:08 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/17 18:12:08 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/17 18:12:08 INFO close: Closing all cached grpc data channels.
19/11/17 18:12:08 INFO close: Closing all cached gRPC state handlers.
19/11/17 18:12:08 INFO run: Done consuming work.
19/11/17 18:12:08 INFO main: Python sdk harness exiting.
19/11/17 18:12:08 INFO GrpcLoggingService: Logging client hanged up.
19/11/17 18:12:08 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/17 18:12:08 INFO Executor: Finished task 1.0 in stage 132.0 (TID 159). 15272 bytes result sent to driver
19/11/17 18:12:08 INFO TaskSetManager: Starting task 0.0 in stage 132.0 (TID 160, localhost, executor driver, partition 0, PROCESS_LOCAL, 7977 bytes)
19/11/17 18:12:08 INFO Executor: Running task 0.0 in stage 132.0 (TID 160)
19/11/17 18:12:08 INFO TaskSetManager: Finished task 1.0 in stage 132.0 (TID 159) in 849 ms on localhost (executor driver) (1/2)
19/11/17 18:12:08 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestPlE6iO/job_8a74c158-ac8d-4349-a8e2-68f2322ccb0f/MANIFEST
19/11/17 18:12:08 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestPlE6iO/job_8a74c158-ac8d-4349-a8e2-68f2322ccb0f/MANIFEST -> 0 artifacts
19/11/17 18:12:09 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/17 18:12:09 INFO main: Logging handler created.
19/11/17 18:12:09 INFO main: semi_persistent_directory: /tmp
19/11/17 18:12:09 INFO start: Status HTTP server running at localhost:33065
19/11/17 18:12:09 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/17 18:12:09 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574014324.92_28b022f7-4863-4c0b-a02d-b6cba8c7edc5', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/17 18:12:09 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574014324.92', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46105'}
19/11/17 18:12:09 INFO __init__: Creating state cache with size 0
19/11/17 18:12:09 INFO __init__: Creating insecure control channel for localhost:43589.
19/11/17 18:12:09 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/17 18:12:09 INFO __init__: Control channel established.
19/11/17 18:12:09 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/17 18:12:09 INFO create_state_handler: Creating insecure state channel for localhost:39289.
19/11/17 18:12:09 INFO create_state_handler: State channel established.
19/11/17 18:12:09 INFO GrpcDataService: Beam Fn Data client connected.
19/11/17 18:12:09 INFO create_data_channel: Creating client data channel for localhost:44133
19/11/17 18:12:09 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/17 18:12:09 INFO run: No more requests from control plane
19/11/17 18:12:09 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/17 18:12:09 INFO close: Closing all cached grpc data channels.
19/11/17 18:12:09 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/17 18:12:09 INFO close: Closing all cached gRPC state handlers.
19/11/17 18:12:09 INFO run: Done consuming work.
19/11/17 18:12:09 INFO main: Python sdk harness exiting.
19/11/17 18:12:09 INFO GrpcLoggingService: Logging client hanged up.
19/11/17 18:12:09 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/17 18:12:09 INFO Executor: Finished task 0.0 in stage 132.0 (TID 160). 13710 bytes result sent to driver
19/11/17 18:12:09 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 160) in 1042 ms on localhost (executor driver) (2/2)
19/11/17 18:12:09 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/17 18:12:09 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 1.897 s
19/11/17 18:12:09 INFO DAGScheduler: looking for newly runnable stages
19/11/17 18:12:09 INFO DAGScheduler: running: Set()
19/11/17 18:12:09 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/17 18:12:09 INFO DAGScheduler: failed: Set()
19/11/17 18:12:09 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/17 18:12:09 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.4 GB)
19/11/17 18:12:09 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.3 KB, free 13.4 GB)
19/11/17 18:12:09 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:39369 (size: 12.3 KB, free: 13.4 GB)
19/11/17 18:12:09 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/17 18:12:09 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/17 18:12:09 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/17 18:12:09 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/17 18:12:09 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/17 18:12:09 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/17 18:12:09 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/17 18:12:09 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestPlE6iO/job_8a74c158-ac8d-4349-a8e2-68f2322ccb0f/MANIFEST
19/11/17 18:12:09 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestPlE6iO/job_8a74c158-ac8d-4349-a8e2-68f2322ccb0f/MANIFEST -> 0 artifacts
19/11/17 18:12:10 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/17 18:12:10 INFO main: Logging handler created.
19/11/17 18:12:10 INFO start: Status HTTP server running at localhost:45715
19/11/17 18:12:10 INFO main: semi_persistent_directory: /tmp
19/11/17 18:12:10 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/17 18:12:10 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1574014324.92_28b022f7-4863-4c0b-a02d-b6cba8c7edc5', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/17 18:12:10 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574014324.92', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46105'}
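For reference, the options echoed above correspond to a submission roughly like the following sketch; the job_endpoint value is taken from the log, while the worker command path is a placeholder, not the workspace path used here.

    # Sketch only: submit a pipeline to the portable job server logged above.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:46105',                        # from the log
        '--environment_type=PROCESS',
        '--environment_config={"command": "/path/to/sdk_worker.sh"}',  # placeholder
        '--experiments=beam_fn_api',
    ])

    with beam.Pipeline(options=options) as p:
        p | beam.Create(['a', 'b']) | beam.Map(lambda x: x.upper())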
19/11/17 18:12:10 INFO __init__: Creating state cache with size 0
19/11/17 18:12:10 INFO __init__: Creating insecure control channel for localhost:35643.
19/11/17 18:12:10 INFO __init__: Control channel established.
19/11/17 18:12:10 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/17 18:12:10 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/17 18:12:10 INFO create_state_handler: Creating insecure state channel for localhost:41541.
19/11/17 18:12:10 INFO create_state_handler: State channel established.
19/11/17 18:12:10 INFO create_data_channel: Creating client data channel for localhost:34603
19/11/17 18:12:10 INFO GrpcDataService: Beam Fn Data client connected.
19/11/17 18:12:10 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/17 18:12:10 INFO run: No more requests from control plane
19/11/17 18:12:10 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/17 18:12:10 INFO close: Closing all cached grpc data channels.
19/11/17 18:12:10 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/17 18:12:10 INFO close: Closing all cached gRPC state handlers.
19/11/17 18:12:10 INFO run: Done consuming work.
19/11/17 18:12:10 INFO main: Python sdk harness exiting.
19/11/17 18:12:10 INFO GrpcLoggingService: Logging client hanged up.
19/11/17 18:12:10 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/17 18:12:10 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 11970 bytes result sent to driver
19/11/17 18:12:10 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 941 ms on localhost (executor driver) (1/1)
19/11/17 18:12:10 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/17 18:12:10 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 0.946 s
19/11/17 18:12:10 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 4.472771 s
19/11/17 18:12:10 INFO SparkPipelineRunner: Job test_windowing_1574014324.92_28b022f7-4863-4c0b-a02d-b6cba8c7edc5 finished.
19/11/17 18:12:10 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
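Because of the warning above, metrics queries against this runner's PipelineResult may come back empty; a hedged sketch of what such a query looks like (the helper name is illustrative):

    # Sketch: query counters from a PipelineResult (the object Pipeline.run()
    # returns). On this runner the WARN above means monitoring infos are not
    # collected, so the result set may be empty.
    from apache_beam.metrics.metric import MetricsFilter

    def dump_counters(result):
        for counter in result.metrics().query(MetricsFilter())['counters']:
            print('%s = %s' % (counter.key, counter.committed))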
19/11/17 18:12:10 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktestPlE6iO/job_8a74c158-ac8d-4349-a8e2-68f2322ccb0f/MANIFEST has 0 artifact locations
19/11/17 18:12:10 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestPlE6iO/job_8a74c158-ac8d-4349-a8e2-68f2322ccb0f/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
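All three failures below hang or fail inside wait_until_finish, which blocks on the job service's state stream. Schematically (a simplification of the loop at portable_runner.py line 421 in the tracebacks that follow, not the actual implementation):

    # Simplified shape of PortableRunner's wait loop; illustrative only.
    def wait_until_finish(state_stream, terminal_states):
        for state_response in state_stream:      # blocking gRPC read; this is
            if state_response.state in terminal_states:  # where tests time out
                return state_response.state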
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 421, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
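The traceback ends in a test-harness watchdog (portable_runner_test.py line 75 raising BaseException, plus the "# Thread: <...>" dumps seen elsewhere in this log). A minimal sketch consistent with that behavior, assuming a SIGALRM-based handler; this is illustrative, not Beam's actual test code:

    import signal
    import threading

    def install_timeout(timeout=60):
        def handler(signum, frame):
            msg = 'Timed out after %d seconds.' % timeout
            print('==================== %s ====================' % msg)
            for t in threading.enumerate():   # mirrors the "# Thread:" lines
                print('# Thread: %s' % t)
            # BaseException escapes "except Exception" blocks in code under test.
            raise BaseException(msg)
        signal.signal(signal.SIGALRM, handler)
        signal.alarm(timeout)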

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139684371199744)>

# Thread: <Thread(Thread-120, started daemon 139684379592448)>

# Thread: <_MainThread(MainThread, started 139685159032576)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139683870795520)>

# Thread: <Thread(Thread-126, started daemon 139683879188224)>

# Thread: <_MainThread(MainThread, started 139685159032576)>

Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 421, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
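test_pardo_timers exercises user timers in a stateful ParDo. A minimal sketch of the feature under test using the public Beam Python API (illustrative, not the test body; the DoFn must run on keyed (k, v) input):

    import apache_beam as beam
    from apache_beam.transforms.timeutil import TimeDomain
    from apache_beam.transforms.userstate import TimerSpec, on_timer

    class FireTimerDoFn(beam.DoFn):
        EXPIRY = TimerSpec('expiry', TimeDomain.WATERMARK)

        def process(self, element, timer=beam.DoFn.TimerParam(EXPIRY)):
            timer.set(10)   # fire when the watermark passes timestamp 10

        @on_timer(EXPIRY)
        def on_expiry(self):
            yield 'fired'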

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 431, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574014315.23_23066a22-9166-4260-a2bd-e6e4c0845886 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
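The UnsupportedOperationException above indicates the Spark portable runner had no handler for splittable-DoFn checkpoints (deferred residuals). For reference, a sketch of the SDF shape such a test runs; the provider and DoFn names are illustrative, not the test's own:

    import apache_beam as beam
    from apache_beam.io.restriction_trackers import OffsetRange, OffsetRestrictionTracker
    from apache_beam.transforms.core import RestrictionProvider

    class CharRestrictionProvider(RestrictionProvider):
        def initial_restriction(self, element):
            return OffsetRange(0, len(element))

        def create_tracker(self, restriction):
            return OffsetRestrictionTracker(restriction)

        def restriction_size(self, element, restriction):
            return restriction.size()

    class ExpandCharsDoFn(beam.DoFn):
        def process(self, element,
                    tracker=beam.DoFn.RestrictionParam(CharRestrictionProvider())):
            # Claim positions one at a time; the runner may checkpoint the
            # unclaimed remainder, which this Spark runner reports it cannot do.
            pos = tracker.current_restriction().start
            while tracker.try_claim(pos):
                yield element[pos]
                pos += 1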

----------------------------------------------------------------------
Ran 38 tests in 312.648s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 3s
59 actionable tasks: 46 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/krssz7ovjwuas

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1559

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1559/display/redirect>

Changes:


------------------------------------------
[...truncated 1.64 MB...]
19/11/17 12:11:51 INFO DAGScheduler: waiting: Set(ShuffleMapStage 132, ResultStage 133)
19/11/17 12:11:51 INFO DAGScheduler: failed: Set()
19/11/17 12:11:51 INFO DAGScheduler: Submitting ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115), which has no missing parents
19/11/17 12:11:51 INFO MemoryStore: Block broadcast_129 stored as values in memory (estimated size 57.2 KB, free 13.4 GB)
19/11/17 12:11:51 INFO MemoryStore: Block broadcast_129_piece0 stored as bytes in memory (estimated size 22.9 KB, free 13.4 GB)
19/11/17 12:11:51 INFO BlockManagerInfo: Added broadcast_129_piece0 in memory on localhost:33229 (size: 22.9 KB, free: 13.4 GB)
19/11/17 12:11:51 INFO SparkContext: Created broadcast 129 from broadcast at DAGScheduler.scala:1161
19/11/17 12:11:51 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115) (first 15 tasks are for partitions Vector(0, 1))
19/11/17 12:11:51 INFO TaskSchedulerImpl: Adding task set 132.0 with 2 tasks
19/11/17 12:11:51 INFO TaskSetManager: Starting task 1.0 in stage 132.0 (TID 159, localhost, executor driver, partition 1, NODE_LOCAL, 7760 bytes)
19/11/17 12:11:51 INFO Executor: Running task 1.0 in stage 132.0 (TID 159)
19/11/17 12:11:51 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/17 12:11:51 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/17 12:11:51 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestl5OinY/job_fd55eb3e-b44d-422d-b31d-e258737c19ec/MANIFEST
19/11/17 12:11:51 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestl5OinY/job_fd55eb3e-b44d-422d-b31d-e258737c19ec/MANIFEST -> 0 artifacts
19/11/17 12:11:52 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/17 12:11:52 INFO main: Logging handler created.
19/11/17 12:11:52 INFO start: Status HTTP server running at localhost:41505
19/11/17 12:11:52 INFO main: semi_persistent_directory: /tmp
19/11/17 12:11:52 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/17 12:11:52 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573992708.82_4ae0d903-5cb8-4c31-a5e5-92ead92e5883', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/17 12:11:52 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573992708.82', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41679'}
19/11/17 12:11:52 INFO __init__: Creating state cache with size 0
19/11/17 12:11:52 INFO __init__: Creating insecure control channel for localhost:40275.
19/11/17 12:11:52 INFO __init__: Control channel established.
19/11/17 12:11:52 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/17 12:11:52 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/17 12:11:52 INFO create_state_handler: Creating insecure state channel for localhost:38279.
19/11/17 12:11:52 INFO create_state_handler: State channel established.
19/11/17 12:11:52 INFO create_data_channel: Creating client data channel for localhost:40323
19/11/17 12:11:52 INFO GrpcDataService: Beam Fn Data client connected.
19/11/17 12:11:52 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/17 12:11:52 INFO run: No more requests from control plane
19/11/17 12:11:52 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/17 12:11:52 INFO close: Closing all cached grpc data channels.
19/11/17 12:11:52 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/17 12:11:52 INFO close: Closing all cached gRPC state handlers.
19/11/17 12:11:52 INFO run: Done consuming work.
19/11/17 12:11:52 INFO main: Python sdk harness exiting.
19/11/17 12:11:52 INFO GrpcLoggingService: Logging client hanged up.
19/11/17 12:11:52 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/17 12:11:52 INFO Executor: Finished task 1.0 in stage 132.0 (TID 159). 15229 bytes result sent to driver
19/11/17 12:11:52 INFO TaskSetManager: Starting task 0.0 in stage 132.0 (TID 160, localhost, executor driver, partition 0, PROCESS_LOCAL, 7977 bytes)
19/11/17 12:11:52 INFO Executor: Running task 0.0 in stage 132.0 (TID 160)
19/11/17 12:11:52 INFO TaskSetManager: Finished task 1.0 in stage 132.0 (TID 159) in 913 ms on localhost (executor driver) (1/2)
19/11/17 12:11:52 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestl5OinY/job_fd55eb3e-b44d-422d-b31d-e258737c19ec/MANIFEST
19/11/17 12:11:52 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestl5OinY/job_fd55eb3e-b44d-422d-b31d-e258737c19ec/MANIFEST -> 0 artifacts
19/11/17 12:11:53 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/17 12:11:53 INFO main: Logging handler created.
19/11/17 12:11:53 INFO start: Status HTTP server running at localhost:33767
19/11/17 12:11:53 INFO main: semi_persistent_directory: /tmp
19/11/17 12:11:53 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/17 12:11:53 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573992708.82_4ae0d903-5cb8-4c31-a5e5-92ead92e5883', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/17 12:11:53 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573992708.82', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41679'}
19/11/17 12:11:53 INFO __init__: Creating state cache with size 0
19/11/17 12:11:53 INFO __init__: Creating insecure control channel for localhost:33515.
19/11/17 12:11:53 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/17 12:11:53 INFO __init__: Control channel established.
19/11/17 12:11:53 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/17 12:11:53 INFO create_state_handler: Creating insecure state channel for localhost:40893.
19/11/17 12:11:53 INFO create_state_handler: State channel established.
19/11/17 12:11:53 INFO create_data_channel: Creating client data channel for localhost:34597
19/11/17 12:11:53 INFO GrpcDataService: Beam Fn Data client connected.
19/11/17 12:11:53 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/17 12:11:53 INFO run: No more requests from control plane
19/11/17 12:11:53 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/17 12:11:53 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/17 12:11:53 INFO close: Closing all cached grpc data channels.
19/11/17 12:11:53 INFO close: Closing all cached gRPC state handlers.
19/11/17 12:11:53 INFO run: Done consuming work.
19/11/17 12:11:53 INFO main: Python sdk harness exiting.
19/11/17 12:11:53 INFO GrpcLoggingService: Logging client hanged up.
19/11/17 12:11:53 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/17 12:11:53 INFO Executor: Finished task 0.0 in stage 132.0 (TID 160). 13710 bytes result sent to driver
19/11/17 12:11:53 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 160) in 820 ms on localhost (executor driver) (2/2)
19/11/17 12:11:53 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/17 12:11:53 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 1.739 s
19/11/17 12:11:53 INFO DAGScheduler: looking for newly runnable stages
19/11/17 12:11:53 INFO DAGScheduler: running: Set()
19/11/17 12:11:53 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/17 12:11:53 INFO DAGScheduler: failed: Set()
19/11/17 12:11:53 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/17 12:11:53 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.4 GB)
19/11/17 12:11:53 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.3 KB, free 13.4 GB)
19/11/17 12:11:53 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:33229 (size: 12.3 KB, free: 13.4 GB)
19/11/17 12:11:53 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/17 12:11:53 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/17 12:11:53 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/17 12:11:53 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/17 12:11:53 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/17 12:11:53 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/17 12:11:53 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/17 12:11:53 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestl5OinY/job_fd55eb3e-b44d-422d-b31d-e258737c19ec/MANIFEST
19/11/17 12:11:53 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestl5OinY/job_fd55eb3e-b44d-422d-b31d-e258737c19ec/MANIFEST -> 0 artifacts
19/11/17 12:11:53 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/17 12:11:53 INFO main: Logging handler created.
19/11/17 12:11:53 INFO start: Status HTTP server running at localhost:38847
19/11/17 12:11:53 INFO main: semi_persistent_directory: /tmp
19/11/17 12:11:53 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/17 12:11:53 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573992708.82_4ae0d903-5cb8-4c31-a5e5-92ead92e5883', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/17 12:11:53 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573992708.82', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41679'}
19/11/17 12:11:53 INFO __init__: Creating state cache with size 0
19/11/17 12:11:53 INFO __init__: Creating insecure control channel for localhost:39529.
19/11/17 12:11:53 INFO __init__: Control channel established.
19/11/17 12:11:53 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/17 12:11:53 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/17 12:11:53 INFO create_state_handler: Creating insecure state channel for localhost:45327.
19/11/17 12:11:53 INFO create_state_handler: State channel established.
19/11/17 12:11:53 INFO create_data_channel: Creating client data channel for localhost:42163
19/11/17 12:11:53 INFO GrpcDataService: Beam Fn Data client connected.
19/11/17 12:11:53 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/17 12:11:53 INFO run: No more requests from control plane
19/11/17 12:11:53 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/17 12:11:53 INFO close: Closing all cached grpc data channels.
19/11/17 12:11:53 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/17 12:11:53 INFO close: Closing all cached gRPC state handlers.
19/11/17 12:11:53 INFO run: Done consuming work.
19/11/17 12:11:53 INFO main: Python sdk harness exiting.
19/11/17 12:11:53 INFO GrpcLoggingService: Logging client hanged up.
19/11/17 12:11:54 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/17 12:11:54 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 11970 bytes result sent to driver
19/11/17 12:11:54 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 811 ms on localhost (executor driver) (1/1)
19/11/17 12:11:54 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/17 12:11:54 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 0.816 s
19/11/17 12:11:54 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 4.261921 s
19/11/17 12:11:54 INFO SparkPipelineRunner: Job test_windowing_1573992708.82_4ae0d903-5cb8-4c31-a5e5-92ead92e5883 finished.
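The job that just finished is test_windowing; a minimal example of the fixed-window GroupByKey pattern such a test exercises (timestamps and window size are illustrative, not the test's values):

    import apache_beam as beam
    from apache_beam.transforms import window

    with beam.Pipeline() as p:
        (p
         | beam.Create([('k', i) for i in range(4)])
         | 'Stamp' >> beam.Map(lambda kv: window.TimestampedValue(kv, kv[1]))
         | beam.WindowInto(window.FixedWindows(2))   # 2-second windows
         | beam.GroupByKey())                        # ('k', [0, 1]) and ('k', [2, 3])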
19/11/17 12:11:54 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/17 12:11:54 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktestl5OinY/job_fd55eb3e-b44d-422d-b31d-e258737c19ec/MANIFEST has 0 artifact locations
19/11/17 12:11:54 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestl5OinY/job_fd55eb3e-b44d-422d-b31d-e258737c19ec/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 421, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
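test_pardo_state_with_custom_key_coder drives state requests under a key coder the runner cannot interpret. A sketch of the stateful-ParDo idiom involved, whose reads and writes travel over the Fn API state channel seen in the worker logs above (the DoFn and coder choice are illustrative, not the test's actual code):

    import apache_beam as beam
    from apache_beam.coders import VarIntCoder
    from apache_beam.transforms.userstate import BagStateSpec

    class CountPerKeyDoFn(beam.DoFn):
        SEEN = BagStateSpec('seen', VarIntCoder())

        def process(self, kv, seen=beam.DoFn.StateParam(SEEN)):
            key, _ = kv
            seen.add(1)
            yield key, sum(seen.read())  # each read is a state request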

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 421, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139877592966912)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
# Thread: <Thread(Thread-119, started daemon 139877584574208)>

    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
# Thread: <_MainThread(MainThread, started 139878372407040)>
==================== Timed out after 60 seconds. ====================

  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
# Thread: <Thread(wait_until_finish_read, started daemon 139876951389952)>

BaseException: Timed out after 60 seconds.

# Thread: <Thread(Thread-124, started daemon 139877566478080)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
# Thread: <_MainThread(MainThread, started 139878372407040)>

Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
# Thread: <Thread(Thread-119, started daemon 139877584574208)>

  File "apache_beam/runners/portability/portable_runner.py", line 431, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
# Thread: <Thread(wait_until_finish_read, started daemon 139877592966912)>
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1573992699.42_c5a3e9dd-1be0-496f-b7d6-cc47b66e2514 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 310.053s

FAILED (errors=3, skipped=9)
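All three errors surface through the assert_that/equal_to idiom visible in the tracebacks above; for reference, that testing pattern in isolation:

    import apache_beam as beam
    from apache_beam.testing.util import assert_that, equal_to

    with beam.Pipeline() as p:
        actual = p | beam.Create([1, 2, 3]) | beam.Map(lambda x: x * 2)
        assert_that(actual, equal_to([2, 4, 6]))  # checked when the pipeline runs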

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 14s
59 actionable tasks: 46 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/edhgv4wcomnoq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1558

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1558/display/redirect>

Changes:


------------------------------------------
[...truncated 1.65 MB...]
19/11/17 06:11:36 INFO main: Logging handler created.
19/11/17 06:11:36 INFO start: Status HTTP server running at localhost:32905
19/11/17 06:11:36 INFO main: semi_persistent_directory: /tmp
19/11/17 06:11:36 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/17 06:11:36 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573971094.49_6f6ccdf9-668a-4e53-b90b-45c7e74ad24b', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/17 06:11:36 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573971094.49', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50447'}
19/11/17 06:11:36 INFO __init__: Creating state cache with size 0
19/11/17 06:11:36 INFO __init__: Creating insecure control channel for localhost:39843.
19/11/17 06:11:36 INFO __init__: Control channel established.
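The control and state channel lines in this block correspond to plain insecure gRPC channels; schematically (the endpoint is from the log above, the readiness wait is illustrative):

    import grpc

    channel = grpc.insecure_channel('localhost:39843')
    grpc.channel_ready_future(channel).result(timeout=60)  # "Control channel established."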
19/11/17 06:11:36 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/17 06:11:36 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/17 06:11:36 INFO create_state_handler: Creating insecure state channel for localhost:45567.
19/11/17 06:11:36 INFO create_state_handler: State channel established.
19/11/17 06:11:36 INFO create_data_channel: Creating client data channel for localhost:34637
19/11/17 06:11:36 INFO GrpcDataService: Beam Fn Data client connected.
19/11/17 06:11:36 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/17 06:11:36 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/17 06:11:36 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/17 06:11:36 INFO run: No more requests from control plane
19/11/17 06:11:36 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/17 06:11:36 INFO close: Closing all cached grpc data channels.
19/11/17 06:11:36 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/17 06:11:36 INFO close: Closing all cached gRPC state handlers.
19/11/17 06:11:36 INFO run: Done consuming work.
19/11/17 06:11:36 INFO main: Python sdk harness exiting.
19/11/17 06:11:36 INFO GrpcLoggingService: Logging client hanged up.
19/11/17 06:11:37 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/17 06:11:37 INFO Executor: Finished task 0.0 in stage 131.0 (TID 158). 12763 bytes result sent to driver
19/11/17 06:11:37 INFO TaskSetManager: Finished task 0.0 in stage 131.0 (TID 158) in 800 ms on localhost (executor driver) (1/1)
19/11/17 06:11:37 INFO TaskSchedulerImpl: Removed TaskSet 131.0, whose tasks have all completed, from pool 
19/11/17 06:11:37 INFO DAGScheduler: ShuffleMapStage 131 (mapToPair at GroupCombineFunctions.java:55) finished in 0.806 s
19/11/17 06:11:37 INFO DAGScheduler: looking for newly runnable stages
19/11/17 06:11:37 INFO DAGScheduler: running: Set()
19/11/17 06:11:37 INFO DAGScheduler: waiting: Set(ShuffleMapStage 132, ResultStage 133)
19/11/17 06:11:37 INFO DAGScheduler: failed: Set()
19/11/17 06:11:37 INFO DAGScheduler: Submitting ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115), which has no missing parents
19/11/17 06:11:37 INFO MemoryStore: Block broadcast_129 stored as values in memory (estimated size 57.2 KB, free 13.5 GB)
19/11/17 06:11:37 INFO MemoryStore: Block broadcast_129_piece0 stored as bytes in memory (estimated size 22.0 KB, free 13.5 GB)
19/11/17 06:11:37 INFO BlockManagerInfo: Added broadcast_129_piece0 in memory on localhost:41899 (size: 22.0 KB, free: 13.5 GB)
19/11/17 06:11:37 INFO SparkContext: Created broadcast 129 from broadcast at DAGScheduler.scala:1161
19/11/17 06:11:37 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115) (first 15 tasks are for partitions Vector(0, 1))
19/11/17 06:11:37 INFO TaskSchedulerImpl: Adding task set 132.0 with 2 tasks
19/11/17 06:11:37 INFO TaskSetManager: Starting task 0.0 in stage 132.0 (TID 159, localhost, executor driver, partition 0, NODE_LOCAL, 7760 bytes)
19/11/17 06:11:37 INFO Executor: Running task 0.0 in stage 132.0 (TID 159)
19/11/17 06:11:37 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/17 06:11:37 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/17 06:11:37 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestRPWZF4/job_881b0732-41b9-46b2-8a92-931507699436/MANIFEST
19/11/17 06:11:37 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestRPWZF4/job_881b0732-41b9-46b2-8a92-931507699436/MANIFEST -> 0 artifacts
19/11/17 06:11:37 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/17 06:11:37 INFO main: Logging handler created.
19/11/17 06:11:37 INFO start: Status HTTP server running at localhost:37073
19/11/17 06:11:37 INFO main: semi_persistent_directory: /tmp
19/11/17 06:11:37 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/17 06:11:37 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573971094.49_6f6ccdf9-668a-4e53-b90b-45c7e74ad24b', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/17 06:11:37 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573971094.49', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50447'}
19/11/17 06:11:37 INFO __init__: Creating state cache with size 0
19/11/17 06:11:37 INFO __init__: Creating insecure control channel for localhost:43467.
19/11/17 06:11:37 INFO __init__: Control channel established.
19/11/17 06:11:37 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/17 06:11:37 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/17 06:11:37 INFO create_state_handler: Creating insecure state channel for localhost:45999.
19/11/17 06:11:37 INFO create_state_handler: State channel established.
19/11/17 06:11:37 INFO create_data_channel: Creating client data channel for localhost:39865
19/11/17 06:11:37 INFO GrpcDataService: Beam Fn Data client connected.
19/11/17 06:11:37 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/17 06:11:37 INFO run: No more requests from control plane
19/11/17 06:11:37 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/17 06:11:37 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/17 06:11:37 INFO close: Closing all cached grpc data channels.
19/11/17 06:11:37 INFO close: Closing all cached gRPC state handlers.
19/11/17 06:11:37 INFO run: Done consuming work.
19/11/17 06:11:37 INFO main: Python sdk harness exiting.
19/11/17 06:11:37 INFO GrpcLoggingService: Logging client hanged up.
19/11/17 06:11:37 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/17 06:11:37 INFO Executor: Finished task 0.0 in stage 132.0 (TID 159). 15229 bytes result sent to driver
19/11/17 06:11:37 INFO TaskSetManager: Starting task 1.0 in stage 132.0 (TID 160, localhost, executor driver, partition 1, PROCESS_LOCAL, 7977 bytes)
19/11/17 06:11:37 INFO Executor: Running task 1.0 in stage 132.0 (TID 160)
19/11/17 06:11:37 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 159) in 835 ms on localhost (executor driver) (1/2)
19/11/17 06:11:37 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestRPWZF4/job_881b0732-41b9-46b2-8a92-931507699436/MANIFEST
19/11/17 06:11:37 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestRPWZF4/job_881b0732-41b9-46b2-8a92-931507699436/MANIFEST -> 0 artifacts
19/11/17 06:11:38 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/17 06:11:38 INFO main: Logging handler created.
19/11/17 06:11:38 INFO start: Status HTTP server running at localhost:41907
19/11/17 06:11:38 INFO main: semi_persistent_directory: /tmp
19/11/17 06:11:38 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/17 06:11:38 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573971094.49_6f6ccdf9-668a-4e53-b90b-45c7e74ad24b', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/17 06:11:38 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573971094.49', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50447'}
19/11/17 06:11:38 INFO __init__: Creating state cache with size 0
19/11/17 06:11:38 INFO __init__: Creating insecure control channel for localhost:35517.
19/11/17 06:11:38 INFO __init__: Control channel established.
19/11/17 06:11:38 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/17 06:11:38 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/17 06:11:38 INFO create_state_handler: Creating insecure state channel for localhost:36585.
19/11/17 06:11:38 INFO create_state_handler: State channel established.
19/11/17 06:11:38 INFO GrpcDataService: Beam Fn Data client connected.
19/11/17 06:11:38 INFO create_data_channel: Creating client data channel for localhost:43331
19/11/17 06:11:38 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/17 06:11:38 INFO run: No more requests from control plane
19/11/17 06:11:38 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/17 06:11:38 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/17 06:11:38 INFO close: Closing all cached grpc data channels.
19/11/17 06:11:38 INFO close: Closing all cached gRPC state handlers.
19/11/17 06:11:38 INFO run: Done consuming work.
19/11/17 06:11:38 INFO main: Python sdk harness exiting.
19/11/17 06:11:38 INFO GrpcLoggingService: Logging client hanged up.
19/11/17 06:11:38 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/17 06:11:38 INFO Executor: Finished task 1.0 in stage 132.0 (TID 160). 13710 bytes result sent to driver
19/11/17 06:11:38 INFO TaskSetManager: Finished task 1.0 in stage 132.0 (TID 160) in 802 ms on localhost (executor driver) (2/2)
19/11/17 06:11:38 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/17 06:11:38 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 1.641 s
19/11/17 06:11:38 INFO DAGScheduler: looking for newly runnable stages
19/11/17 06:11:38 INFO DAGScheduler: running: Set()
19/11/17 06:11:38 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/17 06:11:38 INFO DAGScheduler: failed: Set()
19/11/17 06:11:38 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/17 06:11:38 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.5 GB)
19/11/17 06:11:38 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.4 KB, free 13.5 GB)
19/11/17 06:11:38 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:41899 (size: 12.4 KB, free: 13.5 GB)
19/11/17 06:11:38 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/17 06:11:38 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/17 06:11:38 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/17 06:11:38 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/17 06:11:38 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/17 06:11:38 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/17 06:11:38 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/17 06:11:38 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestRPWZF4/job_881b0732-41b9-46b2-8a92-931507699436/MANIFEST
19/11/17 06:11:38 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestRPWZF4/job_881b0732-41b9-46b2-8a92-931507699436/MANIFEST -> 0 artifacts
19/11/17 06:11:39 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/17 06:11:39 INFO main: Logging handler created.
19/11/17 06:11:39 INFO start: Status HTTP server running at localhost:40561
19/11/17 06:11:39 INFO main: semi_persistent_directory: /tmp
19/11/17 06:11:39 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/17 06:11:39 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573971094.49_6f6ccdf9-668a-4e53-b90b-45c7e74ad24b', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/17 06:11:39 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573971094.49', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50447'}
19/11/17 06:11:39 INFO __init__: Creating state cache with size 0
19/11/17 06:11:39 INFO __init__: Creating insecure control channel for localhost:41709.
19/11/17 06:11:39 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/17 06:11:39 INFO __init__: Control channel established.
19/11/17 06:11:39 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/17 06:11:39 INFO create_state_handler: Creating insecure state channel for localhost:40069.
19/11/17 06:11:39 INFO create_state_handler: State channel established.
19/11/17 06:11:39 INFO create_data_channel: Creating client data channel for localhost:43735
19/11/17 06:11:39 INFO GrpcDataService: Beam Fn Data client connected.
19/11/17 06:11:39 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/17 06:11:39 INFO run: No more requests from control plane
19/11/17 06:11:39 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/17 06:11:39 INFO close: Closing all cached grpc data channels.
19/11/17 06:11:39 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/17 06:11:39 INFO close: Closing all cached gRPC state handlers.
19/11/17 06:11:39 INFO run: Done consuming work.
19/11/17 06:11:39 INFO main: Python sdk harness exiting.
19/11/17 06:11:39 INFO GrpcLoggingService: Logging client hanged up.
19/11/17 06:11:39 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/17 06:11:39 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 11970 bytes result sent to driver
19/11/17 06:11:39 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 773 ms on localhost (executor driver) (1/1)
19/11/17 06:11:39 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/17 06:11:39 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 0.779 s
19/11/17 06:11:39 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 3.999803 s
19/11/17 06:11:39 INFO SparkPipelineRunner: Job test_windowing_1573971094.49_6f6ccdf9-668a-4e53-b90b-45c7e74ad24b finished.
19/11/17 06:11:39 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/17 06:11:39 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktestRPWZF4/job_881b0732-41b9-46b2-8a92-931507699436/MANIFEST has 0 artifact locations
19/11/17 06:11:39 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestRPWZF4/job_881b0732-41b9-46b2-8a92-931507699436/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 421, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 431, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1573971085.96_5372fd4a-2dbf-46e1-8d72-eb288a11b206 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139809811183360)>

# Thread: <Thread(Thread-119, started daemon 139809827968768)>

# Thread: <_MainThread(MainThread, started 139810607408896)>

----------------------------------------------------------------------
Ran 38 tests in 295.585s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 5s
59 actionable tasks: 46 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/3pxjuqcgwlawc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1557

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1557/display/redirect>

Changes:


------------------------------------------
[...truncated 1.68 MB...]
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2749
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2642
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2748
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2932
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2594
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2571
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2566
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2616
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2620
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2868
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2768
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2816
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2665
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2805
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2858
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2870
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2644
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2750
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2867
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2701
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2937
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2763
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2780
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2856
19/11/17 00:14:17 INFO ContextCleaner: Cleaned shuffle 63
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2700
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2557
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2895
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2621
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2522
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2528
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2837
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2795
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2765
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2884
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2872
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2860
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2652
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2835
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2918
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2812
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2593
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2554
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2875
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2736
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2920
19/11/17 00:14:17 INFO BlockManagerInfo: Removed broadcast_128_piece0 on localhost:37851 in memory (size: 10.8 KB, free: 13.5 GB)
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2562
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2782
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2914
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2790
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2942
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2663
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2911
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2854
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2946
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2685
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2915
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2767
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2680
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2827
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2617
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2904
19/11/17 00:14:17 INFO ContextCleaner: Cleaned shuffle 62
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2686
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2604
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2928
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2813
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2533
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2514
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2589
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2769
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2648
19/11/17 00:14:17 INFO BlockManagerInfo: Removed broadcast_125_piece0 on localhost:37851 in memory (size: 21.9 KB, free: 13.5 GB)
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2890
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2649
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2708
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2552
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2908
19/11/17 00:14:17 INFO BlockManagerInfo: Removed broadcast_122_piece0 on localhost:37851 in memory (size: 18.4 KB, free: 13.5 GB)
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2698
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2673
19/11/17 00:14:17 INFO BlockManagerInfo: Removed broadcast_113_piece0 on localhost:37851 in memory (size: 13.7 KB, free: 13.5 GB)
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2907
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2738
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2580
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2880
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2582
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2603
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2608
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2542
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2569
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2754
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2947
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2572
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2574
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2624
19/11/17 00:14:17 INFO ContextCleaner: Cleaned accumulator 2564
19/11/17 00:14:18 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/17 00:14:18 INFO main: Logging handler created.
19/11/17 00:14:18 INFO start: Status HTTP server running at localhost:44039
19/11/17 00:14:18 INFO main: semi_persistent_directory: /tmp
19/11/17 00:14:18 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/17 00:14:18 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573949652.0_0a3396e3-8387-445f-8963-9179f1c25e99', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/17 00:14:18 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573949652.0', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44591'}
19/11/17 00:14:18 INFO __init__: Creating state cache with size 0
19/11/17 00:14:18 INFO __init__: Creating insecure control channel for localhost:45857.
19/11/17 00:14:18 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/17 00:14:18 INFO __init__: Control channel established.
19/11/17 00:14:18 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/17 00:14:18 INFO create_state_handler: Creating insecure state channel for localhost:38439.
19/11/17 00:14:18 INFO create_state_handler: State channel established.
19/11/17 00:14:18 INFO create_data_channel: Creating client data channel for localhost:41387
19/11/17 00:14:18 INFO GrpcDataService: Beam Fn Data client connected.
19/11/17 00:14:18 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/17 00:14:18 INFO run: No more requests from control plane
19/11/17 00:14:18 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/17 00:14:18 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/17 00:14:18 INFO close: Closing all cached grpc data channels.
19/11/17 00:14:18 INFO close: Closing all cached gRPC state handlers.
19/11/17 00:14:18 INFO run: Done consuming work.
19/11/17 00:14:18 INFO main: Python sdk harness exiting.
19/11/17 00:14:18 INFO GrpcLoggingService: Logging client hanged up.
19/11/17 00:14:18 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/17 00:14:18 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 12013 bytes result sent to driver
19/11/17 00:14:18 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 1065 ms on localhost (executor driver) (1/1)
19/11/17 00:14:18 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/17 00:14:18 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 1.072 s
19/11/17 00:14:18 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 5.149323 s
19/11/17 00:14:18 INFO SparkPipelineRunner: Job test_windowing_1573949652.0_0a3396e3-8387-445f-8963-9179f1c25e99 finished.
19/11/17 00:14:18 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/17 00:14:18 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktestJ1mMbD/job_3db4ce16-b754-4944-a35b-8fa6ae6a6b15/MANIFEST has 0 artifact locations
19/11/17 00:14:18 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestJ1mMbD/job_3db4ce16-b754-4944-a35b-8fa6ae6a6b15/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 421, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
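
The "Timed out after 60 seconds." banners and the "# Thread: ..." lines woven through these tracebacks come from a test-side watchdog that fires before the pipeline finishes. A minimal sketch of that pattern (an illustration, not the exact code in portable_runner_test.py): a SIGALRM handler dumps every live thread's stack, then raises BaseException so the test aborts instead of hanging.

  import signal
  import sys
  import threading
  import traceback

  TIMEOUT_SECS = 60  # matches the 60-second timeout in these logs

  def handler(signum, frame):
      msg = 'Timed out after %d seconds.' % TIMEOUT_SECS
      print('=' * 20 + ' ' + msg + ' ' + '=' * 20)
      frames = sys._current_frames()
      for thread in threading.enumerate():
          # Written to the same stream the test is using, which is why the
          # dumps appear interleaved with the tracebacks in these logs.
          print('# Thread: %s' % thread)
          traceback.print_stack(frames[thread.ident])
      raise BaseException(msg)  # aborts the hung test case

  signal.signal(signal.SIGALRM, handler)
  signal.alarm(TIMEOUT_SECS)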

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 421, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139843417372416)>

# Thread: <Thread(Thread-116, started daemon 139843408979712)>

# Thread: <_MainThread(MainThread, started 139844196792064)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139843313063680)>

# Thread: <Thread(Thread-122, started daemon 139843321456384)>

# Thread: <_MainThread(MainThread, started 139844196792064)>

# Thread: <Thread(wait_until_finish_read, started daemon 139843417372416)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 431, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1573949640.64_aecddceb-9ecd-430d-bbd6-ecf4e792abf2 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
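
All three failing tests share the same shape: build a pipeline inside a with-block, attach an assert_that check, and let __exit__ call run().wait_until_finish(), which is where both the timeouts and the FAILED state surface. A minimal sketch of that pattern against a portable job server (illustrative only; the endpoint is copied from a log line above, and this is not the actual ValidatesRunner test):

  import apache_beam as beam
  from apache_beam.options.pipeline_options import PipelineOptions
  from apache_beam.testing.util import assert_that, equal_to

  options = PipelineOptions([
      '--runner=PortableRunner',
      '--job_endpoint=localhost:44591',  # hypothetical endpoint
  ])
  with beam.Pipeline(options=options) as p:
      actual = p | beam.Create(['a', 'b', 'c'])
      assert_that(actual, equal_to(['a', 'b', 'c']))
  # Leaving the with-block runs the pipeline; a job stuck in RUNNING trips
  # the 60-second watchdog, and a job ending in FAILED state raises the
  # RuntimeError seen above.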


----------------------------------------------------------------------
Ran 38 tests in 375.772s

FAILED (errors=3, skipped=9)

# Thread: <Thread(Thread-116, started daemon 139843408979712)>

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 40s
59 actionable tasks: 46 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/s6jb7cthdzsti

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1556

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1556/display/redirect>

Changes:


------------------------------------------
[...truncated 1.66 MB...]
19/11/16 18:13:23 INFO DAGScheduler: waiting: Set(ShuffleMapStage 132, ResultStage 133)
19/11/16 18:13:23 INFO DAGScheduler: failed: Set()
19/11/16 18:13:23 INFO DAGScheduler: Submitting ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115), which has no missing parents
19/11/16 18:13:23 INFO MemoryStore: Block broadcast_129 stored as values in memory (estimated size 57.2 KB, free 13.5 GB)
19/11/16 18:13:23 INFO MemoryStore: Block broadcast_129_piece0 stored as bytes in memory (estimated size 22.9 KB, free 13.5 GB)
19/11/16 18:13:23 INFO BlockManagerInfo: Added broadcast_129_piece0 in memory on localhost:41113 (size: 22.9 KB, free: 13.5 GB)
19/11/16 18:13:23 INFO SparkContext: Created broadcast 129 from broadcast at DAGScheduler.scala:1161
19/11/16 18:13:23 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115) (first 15 tasks are for partitions Vector(0, 1))
19/11/16 18:13:23 INFO TaskSchedulerImpl: Adding task set 132.0 with 2 tasks
19/11/16 18:13:23 INFO TaskSetManager: Starting task 1.0 in stage 132.0 (TID 159, localhost, executor driver, partition 1, NODE_LOCAL, 7760 bytes)
19/11/16 18:13:23 INFO Executor: Running task 1.0 in stage 132.0 (TID 159)
19/11/16 18:13:23 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/16 18:13:23 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/16 18:13:23 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestzxb0sd/job_93768e1b-7b9a-4ae6-98b1-2f16c9b971ca/MANIFEST
19/11/16 18:13:23 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestzxb0sd/job_93768e1b-7b9a-4ae6-98b1-2f16c9b971ca/MANIFEST -> 0 artifacts
19/11/16 18:13:24 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/16 18:13:24 INFO main: Logging handler created.
19/11/16 18:13:24 INFO start: Status HTTP server running at localhost:35551
19/11/16 18:13:24 INFO main: semi_persistent_directory: /tmp
19/11/16 18:13:24 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/16 18:13:24 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573928000.74_267a4e76-d2f3-4e0e-8576-5eea7e793e2c', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/16 18:13:24 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573928000.74', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:57947'}
19/11/16 18:13:24 INFO __init__: Creating state cache with size 0
19/11/16 18:13:24 INFO __init__: Creating insecure control channel for localhost:34335.
19/11/16 18:13:24 INFO __init__: Control channel established.
19/11/16 18:13:24 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/16 18:13:24 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/16 18:13:24 INFO create_state_handler: Creating insecure state channel for localhost:46671.
19/11/16 18:13:24 INFO create_state_handler: State channel established.
19/11/16 18:13:24 INFO create_data_channel: Creating client data channel for localhost:41277
19/11/16 18:13:24 INFO GrpcDataService: Beam Fn Data client connected.
19/11/16 18:13:24 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/16 18:13:24 INFO run: No more requests from control plane
19/11/16 18:13:24 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/16 18:13:24 INFO close: Closing all cached grpc data channels.
19/11/16 18:13:24 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 18:13:24 INFO close: Closing all cached gRPC state handlers.
19/11/16 18:13:24 INFO run: Done consuming work.
19/11/16 18:13:24 INFO main: Python sdk harness exiting.
19/11/16 18:13:24 INFO GrpcLoggingService: Logging client hanged up.
19/11/16 18:13:24 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 18:13:24 INFO Executor: Finished task 1.0 in stage 132.0 (TID 159). 15229 bytes result sent to driver
19/11/16 18:13:24 INFO TaskSetManager: Starting task 0.0 in stage 132.0 (TID 160, localhost, executor driver, partition 0, PROCESS_LOCAL, 7977 bytes)
19/11/16 18:13:24 INFO Executor: Running task 0.0 in stage 132.0 (TID 160)
19/11/16 18:13:24 INFO TaskSetManager: Finished task 1.0 in stage 132.0 (TID 159) in 1021 ms on localhost (executor driver) (1/2)
19/11/16 18:13:24 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestzxb0sd/job_93768e1b-7b9a-4ae6-98b1-2f16c9b971ca/MANIFEST
19/11/16 18:13:24 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestzxb0sd/job_93768e1b-7b9a-4ae6-98b1-2f16c9b971ca/MANIFEST -> 0 artifacts
19/11/16 18:13:25 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/16 18:13:25 INFO main: Logging handler created.
19/11/16 18:13:25 INFO start: Status HTTP server running at localhost:40387
19/11/16 18:13:25 INFO main: semi_persistent_directory: /tmp
19/11/16 18:13:25 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/16 18:13:25 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573928000.74_267a4e76-d2f3-4e0e-8576-5eea7e793e2c', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/16 18:13:25 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573928000.74', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:57947'}
19/11/16 18:13:25 INFO __init__: Creating state cache with size 0
19/11/16 18:13:25 INFO __init__: Creating insecure control channel for localhost:44263.
19/11/16 18:13:25 INFO __init__: Control channel established.
19/11/16 18:13:25 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/16 18:13:25 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/16 18:13:25 INFO create_state_handler: Creating insecure state channel for localhost:33627.
19/11/16 18:13:25 INFO create_state_handler: State channel established.
19/11/16 18:13:25 INFO create_data_channel: Creating client data channel for localhost:38525
19/11/16 18:13:25 INFO GrpcDataService: Beam Fn Data client connected.
19/11/16 18:13:25 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/16 18:13:25 INFO run: No more requests from control plane
19/11/16 18:13:25 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/16 18:13:25 INFO close: Closing all cached grpc data channels.
19/11/16 18:13:25 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 18:13:25 INFO close: Closing all cached gRPC state handlers.
19/11/16 18:13:25 INFO run: Done consuming work.
19/11/16 18:13:25 INFO main: Python sdk harness exiting.
19/11/16 18:13:25 INFO GrpcLoggingService: Logging client hanged up.
19/11/16 18:13:25 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 18:13:25 INFO Executor: Finished task 0.0 in stage 132.0 (TID 160). 13710 bytes result sent to driver
19/11/16 18:13:25 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 160) in 907 ms on localhost (executor driver) (2/2)
19/11/16 18:13:25 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/16 18:13:25 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 1.935 s
19/11/16 18:13:25 INFO DAGScheduler: looking for newly runnable stages
19/11/16 18:13:25 INFO DAGScheduler: running: Set()
19/11/16 18:13:25 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/16 18:13:25 INFO DAGScheduler: failed: Set()
19/11/16 18:13:25 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/16 18:13:25 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.5 GB)
19/11/16 18:13:25 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.4 KB, free 13.5 GB)
19/11/16 18:13:25 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:41113 (size: 12.4 KB, free: 13.5 GB)
19/11/16 18:13:25 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/16 18:13:25 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/16 18:13:25 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/16 18:13:25 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/16 18:13:25 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/16 18:13:25 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/16 18:13:25 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/16 18:13:25 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestzxb0sd/job_93768e1b-7b9a-4ae6-98b1-2f16c9b971ca/MANIFEST
19/11/16 18:13:25 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestzxb0sd/job_93768e1b-7b9a-4ae6-98b1-2f16c9b971ca/MANIFEST -> 0 artifacts
19/11/16 18:13:26 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/16 18:13:26 INFO main: Logging handler created.
19/11/16 18:13:26 INFO start: Status HTTP server running at localhost:39929
19/11/16 18:13:26 INFO main: semi_persistent_directory: /tmp
19/11/16 18:13:26 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/16 18:13:26 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573928000.74_267a4e76-d2f3-4e0e-8576-5eea7e793e2c', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/16 18:13:26 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573928000.74', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:57947'}
19/11/16 18:13:26 INFO __init__: Creating state cache with size 0
19/11/16 18:13:26 INFO __init__: Creating insecure control channel for localhost:33565.
19/11/16 18:13:26 INFO __init__: Control channel established.
19/11/16 18:13:26 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/16 18:13:26 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/16 18:13:26 INFO create_state_handler: Creating insecure state channel for localhost:32875.
19/11/16 18:13:26 INFO create_state_handler: State channel established.
19/11/16 18:13:26 INFO create_data_channel: Creating client data channel for localhost:34255
19/11/16 18:13:26 INFO GrpcDataService: Beam Fn Data client connected.
19/11/16 18:13:26 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/16 18:13:26 INFO run: No more requests from control plane
19/11/16 18:13:26 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/16 18:13:26 INFO close: Closing all cached grpc data channels.
19/11/16 18:13:26 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 18:13:26 INFO close: Closing all cached gRPC state handlers.
19/11/16 18:13:26 INFO run: Done consuming work.
19/11/16 18:13:26 INFO main: Python sdk harness exiting.
19/11/16 18:13:26 INFO GrpcLoggingService: Logging client hanged up.
19/11/16 18:13:26 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 18:13:26 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 11970 bytes result sent to driver
19/11/16 18:13:26 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 860 ms on localhost (executor driver) (1/1)
19/11/16 18:13:26 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/16 18:13:26 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 0.866 s
19/11/16 18:13:26 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 4.751208 s
19/11/16 18:13:26 INFO SparkPipelineRunner: Job test_windowing_1573928000.74_267a4e76-d2f3-4e0e-8576-5eea7e793e2c finished.
19/11/16 18:13:26 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/16 18:13:26 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktestzxb0sd/job_93768e1b-7b9a-4ae6-98b1-2f16c9b971ca/MANIFEST has 0 artifact locations
19/11/16 18:13:26 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestzxb0sd/job_93768e1b-7b9a-4ae6-98b1-2f16c9b971ca/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 421, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 421, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140274017752832)>

# Thread: <Thread(Thread-120, started daemon 140274026145536)>

# Thread: <_MainThread(MainThread, started 140275156637440)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140274000967424)>

# Thread: <Thread(Thread-126, started daemon 140274009360128)>

# Thread: <Thread(Thread-120, started daemon 140274026145536)>

# Thread: <Thread(wait_until_finish_read, started daemon 140274017752832)>

# Thread: <_MainThread(MainThread, started 140275156637440)>
======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 431, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1573927990.45_ef19d634-a9e8-48df-b54e-da0d9dbb5654 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 319.989s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 2s
59 actionable tasks: 46 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/b7twef2lwwj6e

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1555

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1555/display/redirect>

Changes:


------------------------------------------
[...truncated 1.66 MB...]
19/11/16 12:09:33 INFO TaskSchedulerImpl: Removed TaskSet 131.0, whose tasks have all completed, from pool 
19/11/16 12:09:33 INFO DAGScheduler: ShuffleMapStage 131 (mapToPair at GroupCombineFunctions.java:55) finished in 0.905 s
19/11/16 12:09:33 INFO DAGScheduler: looking for newly runnable stages
19/11/16 12:09:33 INFO DAGScheduler: running: Set()
19/11/16 12:09:33 INFO DAGScheduler: waiting: Set(ShuffleMapStage 132, ResultStage 133)
19/11/16 12:09:33 INFO DAGScheduler: failed: Set()
19/11/16 12:09:33 INFO DAGScheduler: Submitting ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115), which has no missing parents
19/11/16 12:09:33 INFO MemoryStore: Block broadcast_129 stored as values in memory (estimated size 57.2 KB, free 13.5 GB)
19/11/16 12:09:33 INFO MemoryStore: Block broadcast_129_piece0 stored as bytes in memory (estimated size 22.9 KB, free 13.5 GB)
19/11/16 12:09:33 INFO BlockManagerInfo: Added broadcast_129_piece0 in memory on localhost:46855 (size: 22.9 KB, free: 13.5 GB)
19/11/16 12:09:33 INFO SparkContext: Created broadcast 129 from broadcast at DAGScheduler.scala:1161
19/11/16 12:09:33 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115) (first 15 tasks are for partitions Vector(0, 1))
19/11/16 12:09:33 INFO TaskSchedulerImpl: Adding task set 132.0 with 2 tasks
19/11/16 12:09:33 INFO TaskSetManager: Starting task 1.0 in stage 132.0 (TID 159, localhost, executor driver, partition 1, NODE_LOCAL, 7760 bytes)
19/11/16 12:09:33 INFO Executor: Running task 1.0 in stage 132.0 (TID 159)
19/11/16 12:09:33 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/16 12:09:33 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/16 12:09:33 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestpBskT5/job_59ce4357-ce20-4572-83f5-652cc751bbdf/MANIFEST
19/11/16 12:09:33 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestpBskT5/job_59ce4357-ce20-4572-83f5-652cc751bbdf/MANIFEST -> 0 artifacts
19/11/16 12:09:33 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/16 12:09:33 INFO main: Logging handler created.
19/11/16 12:09:33 INFO start: Status HTTP server running at localhost:35131
19/11/16 12:09:33 INFO main: semi_persistent_directory: /tmp
19/11/16 12:09:33 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/16 12:09:33 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573906170.45_a128d4bd-1109-4a23-b515-2358f03f161a', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/16 12:09:33 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573906170.45', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49147'}
19/11/16 12:09:33 INFO __init__: Creating state cache with size 0
19/11/16 12:09:33 INFO __init__: Creating insecure control channel for localhost:42109.
19/11/16 12:09:33 INFO __init__: Control channel established.
19/11/16 12:09:33 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/16 12:09:33 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/16 12:09:33 INFO create_state_handler: Creating insecure state channel for localhost:36527.
19/11/16 12:09:33 INFO create_state_handler: State channel established.
19/11/16 12:09:33 INFO create_data_channel: Creating client data channel for localhost:42079
19/11/16 12:09:33 INFO GrpcDataService: Beam Fn Data client connected.
19/11/16 12:09:33 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/16 12:09:34 INFO run: No more requests from control plane
19/11/16 12:09:34 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/16 12:09:34 INFO close: Closing all cached grpc data channels.
19/11/16 12:09:34 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 12:09:34 INFO close: Closing all cached gRPC state handlers.
19/11/16 12:09:34 INFO run: Done consuming work.
19/11/16 12:09:34 INFO main: Python sdk harness exiting.
19/11/16 12:09:34 INFO GrpcLoggingService: Logging client hanged up.
19/11/16 12:09:34 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 12:09:34 INFO Executor: Finished task 1.0 in stage 132.0 (TID 159). 15229 bytes result sent to driver
19/11/16 12:09:34 INFO TaskSetManager: Starting task 0.0 in stage 132.0 (TID 160, localhost, executor driver, partition 0, PROCESS_LOCAL, 7977 bytes)
19/11/16 12:09:34 INFO Executor: Running task 0.0 in stage 132.0 (TID 160)
19/11/16 12:09:34 INFO TaskSetManager: Finished task 1.0 in stage 132.0 (TID 159) in 898 ms on localhost (executor driver) (1/2)
19/11/16 12:09:34 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestpBskT5/job_59ce4357-ce20-4572-83f5-652cc751bbdf/MANIFEST
19/11/16 12:09:34 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestpBskT5/job_59ce4357-ce20-4572-83f5-652cc751bbdf/MANIFEST -> 0 artifacts
19/11/16 12:09:34 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/16 12:09:34 INFO main: Logging handler created.
19/11/16 12:09:34 INFO start: Status HTTP server running at localhost:38899
19/11/16 12:09:34 INFO main: semi_persistent_directory: /tmp
19/11/16 12:09:34 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/16 12:09:34 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573906170.45_a128d4bd-1109-4a23-b515-2358f03f161a', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/16 12:09:34 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573906170.45', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49147'}
19/11/16 12:09:34 INFO __init__: Creating state cache with size 0
19/11/16 12:09:34 INFO __init__: Creating insecure control channel for localhost:34701.
19/11/16 12:09:34 INFO __init__: Control channel established.
19/11/16 12:09:34 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/16 12:09:34 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/16 12:09:34 INFO create_state_handler: Creating insecure state channel for localhost:42959.
19/11/16 12:09:34 INFO create_state_handler: State channel established.
19/11/16 12:09:34 INFO create_data_channel: Creating client data channel for localhost:44923
19/11/16 12:09:34 INFO GrpcDataService: Beam Fn Data client connected.
19/11/16 12:09:34 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/16 12:09:34 INFO run: No more requests from control plane
19/11/16 12:09:34 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/16 12:09:34 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 12:09:34 INFO close: Closing all cached grpc data channels.
19/11/16 12:09:34 INFO close: Closing all cached gRPC state handlers.
19/11/16 12:09:34 INFO run: Done consuming work.
19/11/16 12:09:34 INFO main: Python sdk harness exiting.
19/11/16 12:09:34 INFO GrpcLoggingService: Logging client hanged up.
19/11/16 12:09:34 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 12:09:34 INFO Executor: Finished task 0.0 in stage 132.0 (TID 160). 13710 bytes result sent to driver
19/11/16 12:09:34 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 160) in 843 ms on localhost (executor driver) (2/2)
19/11/16 12:09:34 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/16 12:09:34 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 1.747 s
19/11/16 12:09:34 INFO DAGScheduler: looking for newly runnable stages
19/11/16 12:09:34 INFO DAGScheduler: running: Set()
19/11/16 12:09:34 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/16 12:09:34 INFO DAGScheduler: failed: Set()
19/11/16 12:09:34 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/16 12:09:34 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.5 GB)
19/11/16 12:09:34 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.3 KB, free 13.5 GB)
19/11/16 12:09:34 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:46855 (size: 12.3 KB, free: 13.5 GB)
19/11/16 12:09:34 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/16 12:09:34 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/16 12:09:34 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/16 12:09:34 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/16 12:09:34 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/16 12:09:34 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/16 12:09:34 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/16 12:09:34 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestpBskT5/job_59ce4357-ce20-4572-83f5-652cc751bbdf/MANIFEST
19/11/16 12:09:34 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestpBskT5/job_59ce4357-ce20-4572-83f5-652cc751bbdf/MANIFEST -> 0 artifacts
19/11/16 12:09:35 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/16 12:09:35 INFO main: Logging handler created.
19/11/16 12:09:35 INFO start: Status HTTP server running at localhost:35975
19/11/16 12:09:35 INFO main: semi_persistent_directory: /tmp
19/11/16 12:09:35 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/16 12:09:35 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573906170.45_a128d4bd-1109-4a23-b515-2358f03f161a', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/16 12:09:35 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573906170.45', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49147'}
19/11/16 12:09:35 INFO __init__: Creating state cache with size 0
19/11/16 12:09:35 INFO __init__: Creating insecure control channel for localhost:40387.
19/11/16 12:09:35 INFO __init__: Control channel established.
19/11/16 12:09:35 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/16 12:09:35 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/16 12:09:35 INFO create_state_handler: Creating insecure state channel for localhost:45939.
19/11/16 12:09:35 INFO create_state_handler: State channel established.
19/11/16 12:09:35 INFO create_data_channel: Creating client data channel for localhost:33593
19/11/16 12:09:35 INFO GrpcDataService: Beam Fn Data client connected.
19/11/16 12:09:35 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/16 12:09:35 INFO run: No more requests from control plane
19/11/16 12:09:35 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/16 12:09:35 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 12:09:35 INFO close: Closing all cached grpc data channels.
19/11/16 12:09:35 INFO close: Closing all cached gRPC state handlers.
19/11/16 12:09:35 INFO run: Done consuming work.
19/11/16 12:09:35 INFO main: Python sdk harness exiting.
19/11/16 12:09:35 INFO GrpcLoggingService: Logging client hanged up.
19/11/16 12:09:35 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 12:09:35 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 11970 bytes result sent to driver
19/11/16 12:09:35 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 857 ms on localhost (executor driver) (1/1)
19/11/16 12:09:35 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/16 12:09:35 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 0.863 s
19/11/16 12:09:35 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 4.340407 s
19/11/16 12:09:35 INFO SparkPipelineRunner: Job test_windowing_1573906170.45_a128d4bd-1109-4a23-b515-2358f03f161a finished.
19/11/16 12:09:35 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/16 12:09:35 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktestpBskT5/job_59ce4357-ce20-4572-83f5-652cc751bbdf/MANIFEST has 0 artifact locations
19/11/16 12:09:35 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestpBskT5/job_59ce4357-ce20-4572-83f5-652cc751bbdf/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 421, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139774905534208)>

# Thread: <Thread(Thread-120, started daemon 139774897141504)>

# Thread: <_MainThread(MainThread, started 139775684974336)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139774409045760)>

# Thread: <Thread(Thread-126, started daemon 139774400653056)>

# Thread: <_MainThread(MainThread, started 139775684974336)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 421, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 431, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1573906161.33_e81e2655-60f6-447b-98ab-650cf8ee4637 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 293.970s

FAILED (errors=3, skipped=9)
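
A note on the timeout errors above: the "==================== Timed out after 60 seconds. ====================" banners and the "# Thread:" lines that interleave the tracebacks come from a per-test watchdog in portable_runner_test.py (the `handler` frame at line 75 of each traceback), which dumps every live thread and then raises BaseException so the hung wait_until_finish() call unwinds. A minimal sketch of that pattern, assuming a SIGALRM-based implementation and invented names (install_test_timeout, TIMEOUT_SECS); this is illustrative, not Beam's actual code:

    import signal
    import sys
    import threading
    import traceback

    TIMEOUT_SECS = 60  # assumed value, matching the "60 seconds" in the log

    def install_test_timeout(timeout=TIMEOUT_SECS):
        """Arm a SIGALRM watchdog that dumps all live threads, then aborts the test."""
        def handler(signum, frame):
            msg = 'Timed out after %d seconds.' % timeout
            print('=' * 20 + ' ' + msg + ' ' + '=' * 20)
            frames = sys._current_frames()
            for thread in threading.enumerate():
                # One "# Thread: ..." line per live thread, as seen in this log;
                # because it prints while unittest is writing the traceback, the
                # two streams interleave.
                print('# Thread: %s' % thread)
                stack = frames.get(thread.ident)
                if stack is not None:
                    traceback.print_stack(stack)
            # BaseException rather than Exception so it is not swallowed by
            # broad `except Exception` handlers inside the runner.
            raise BaseException(msg)
        signal.signal(signal.SIGALRM, handler)
        signal.alarm(timeout)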

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 21s
59 actionable tasks: 46 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/aiygl7votcrhm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1554

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1554/display/redirect>

Changes:


------------------------------------------
[...truncated 1.67 MB...]
19/11/16 06:14:57 INFO DAGScheduler: waiting: Set(ShuffleMapStage 132, ResultStage 133)
19/11/16 06:14:57 INFO DAGScheduler: failed: Set()
19/11/16 06:14:57 INFO DAGScheduler: Submitting ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115), which has no missing parents
19/11/16 06:14:57 INFO MemoryStore: Block broadcast_129 stored as values in memory (estimated size 57.2 KB, free 13.5 GB)
19/11/16 06:14:57 INFO MemoryStore: Block broadcast_129_piece0 stored as bytes in memory (estimated size 22.9 KB, free 13.5 GB)
19/11/16 06:14:57 INFO BlockManagerInfo: Added broadcast_129_piece0 in memory on localhost:46329 (size: 22.9 KB, free: 13.5 GB)
19/11/16 06:14:57 INFO SparkContext: Created broadcast 129 from broadcast at DAGScheduler.scala:1161
19/11/16 06:14:57 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115) (first 15 tasks are for partitions Vector(0, 1))
19/11/16 06:14:57 INFO TaskSchedulerImpl: Adding task set 132.0 with 2 tasks
19/11/16 06:14:57 INFO TaskSetManager: Starting task 1.0 in stage 132.0 (TID 159, localhost, executor driver, partition 1, NODE_LOCAL, 7760 bytes)
19/11/16 06:14:57 INFO Executor: Running task 1.0 in stage 132.0 (TID 159)
19/11/16 06:14:57 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/16 06:14:57 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/16 06:14:57 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestQFcKQ_/job_fc1b2abd-1bf7-4062-9f1b-33cf8203b69b/MANIFEST
19/11/16 06:14:57 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestQFcKQ_/job_fc1b2abd-1bf7-4062-9f1b-33cf8203b69b/MANIFEST -> 0 artifacts
19/11/16 06:14:58 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/16 06:14:58 INFO main: Logging handler created.
19/11/16 06:14:58 INFO start: Status HTTP server running at localhost:42003
19/11/16 06:14:58 INFO main: semi_persistent_directory: /tmp
19/11/16 06:14:58 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/16 06:14:58 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573884894.3_3fcda3ed-1ae5-4f4f-b333-a4fdc891f1c8', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/16 06:14:58 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573884894.3', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37099'}
19/11/16 06:14:58 INFO __init__: Creating state cache with size 0
19/11/16 06:14:58 INFO __init__: Creating insecure control channel for localhost:44583.
19/11/16 06:14:58 INFO __init__: Control channel established.
19/11/16 06:14:58 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/16 06:14:58 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/16 06:14:58 INFO create_state_handler: Creating insecure state channel for localhost:35475.
19/11/16 06:14:58 INFO create_state_handler: State channel established.
19/11/16 06:14:58 INFO create_data_channel: Creating client data channel for localhost:41953
19/11/16 06:14:58 INFO GrpcDataService: Beam Fn Data client connected.
19/11/16 06:14:58 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/16 06:14:58 INFO run: No more requests from control plane
19/11/16 06:14:58 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/16 06:14:58 INFO close: Closing all cached grpc data channels.
19/11/16 06:14:58 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 06:14:58 INFO close: Closing all cached gRPC state handlers.
19/11/16 06:14:58 INFO run: Done consuming work.
19/11/16 06:14:58 INFO main: Python sdk harness exiting.
19/11/16 06:14:58 INFO GrpcLoggingService: Logging client hanged up.
19/11/16 06:14:58 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 06:14:58 INFO Executor: Finished task 1.0 in stage 132.0 (TID 159). 15229 bytes result sent to driver
19/11/16 06:14:58 INFO TaskSetManager: Starting task 0.0 in stage 132.0 (TID 160, localhost, executor driver, partition 0, PROCESS_LOCAL, 7977 bytes)
19/11/16 06:14:58 INFO Executor: Running task 0.0 in stage 132.0 (TID 160)
19/11/16 06:14:58 INFO TaskSetManager: Finished task 1.0 in stage 132.0 (TID 159) in 1132 ms on localhost (executor driver) (1/2)
19/11/16 06:14:59 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestQFcKQ_/job_fc1b2abd-1bf7-4062-9f1b-33cf8203b69b/MANIFEST
19/11/16 06:14:59 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestQFcKQ_/job_fc1b2abd-1bf7-4062-9f1b-33cf8203b69b/MANIFEST -> 0 artifacts
19/11/16 06:14:59 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/16 06:14:59 INFO main: Logging handler created.
19/11/16 06:14:59 INFO start: Status HTTP server running at localhost:38619
19/11/16 06:14:59 INFO main: semi_persistent_directory: /tmp
19/11/16 06:14:59 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/16 06:14:59 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573884894.3_3fcda3ed-1ae5-4f4f-b333-a4fdc891f1c8', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/16 06:14:59 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573884894.3', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37099'}
19/11/16 06:14:59 INFO __init__: Creating state cache with size 0
19/11/16 06:14:59 INFO __init__: Creating insecure control channel for localhost:36875.
19/11/16 06:14:59 INFO __init__: Control channel established.
19/11/16 06:14:59 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/16 06:14:59 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/16 06:14:59 INFO create_state_handler: Creating insecure state channel for localhost:45069.
19/11/16 06:14:59 INFO create_state_handler: State channel established.
19/11/16 06:14:59 INFO create_data_channel: Creating client data channel for localhost:39701
19/11/16 06:14:59 INFO GrpcDataService: Beam Fn Data client connected.
19/11/16 06:14:59 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/16 06:14:59 INFO run: No more requests from control plane
19/11/16 06:14:59 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/16 06:14:59 INFO close: Closing all cached grpc data channels.
19/11/16 06:14:59 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 06:14:59 INFO close: Closing all cached gRPC state handlers.
19/11/16 06:14:59 INFO run: Done consuming work.
19/11/16 06:14:59 INFO main: Python sdk harness exiting.
19/11/16 06:14:59 INFO GrpcLoggingService: Logging client hanged up.
19/11/16 06:15:00 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 06:15:00 INFO Executor: Finished task 0.0 in stage 132.0 (TID 160). 13710 bytes result sent to driver
19/11/16 06:15:00 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 160) in 1115 ms on localhost (executor driver) (2/2)
19/11/16 06:15:00 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/16 06:15:00 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 2.253 s
19/11/16 06:15:00 INFO DAGScheduler: looking for newly runnable stages
19/11/16 06:15:00 INFO DAGScheduler: running: Set()
19/11/16 06:15:00 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/16 06:15:00 INFO DAGScheduler: failed: Set()
19/11/16 06:15:00 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/16 06:15:00 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.5 GB)
19/11/16 06:15:00 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.4 KB, free 13.5 GB)
19/11/16 06:15:00 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:46329 (size: 12.4 KB, free: 13.5 GB)
19/11/16 06:15:00 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/16 06:15:00 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/16 06:15:00 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/16 06:15:00 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/16 06:15:00 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/16 06:15:00 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/16 06:15:00 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 1 ms
19/11/16 06:15:00 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestQFcKQ_/job_fc1b2abd-1bf7-4062-9f1b-33cf8203b69b/MANIFEST
19/11/16 06:15:00 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestQFcKQ_/job_fc1b2abd-1bf7-4062-9f1b-33cf8203b69b/MANIFEST -> 0 artifacts
19/11/16 06:15:01 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/16 06:15:01 INFO main: Logging handler created.
19/11/16 06:15:01 INFO start: Status HTTP server running at localhost:36053
19/11/16 06:15:01 INFO main: semi_persistent_directory: /tmp
19/11/16 06:15:01 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/16 06:15:01 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573884894.3_3fcda3ed-1ae5-4f4f-b333-a4fdc891f1c8', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/16 06:15:01 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573884894.3', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37099'}
19/11/16 06:15:01 INFO __init__: Creating state cache with size 0
19/11/16 06:15:01 INFO __init__: Creating insecure control channel for localhost:36869.
19/11/16 06:15:01 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/16 06:15:01 INFO __init__: Control channel established.
19/11/16 06:15:01 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/16 06:15:01 INFO create_state_handler: Creating insecure state channel for localhost:40423.
19/11/16 06:15:01 INFO create_state_handler: State channel established.
19/11/16 06:15:01 INFO create_data_channel: Creating client data channel for localhost:45993
19/11/16 06:15:01 INFO GrpcDataService: Beam Fn Data client connected.
19/11/16 06:15:01 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/16 06:15:01 INFO run: No more requests from control plane
19/11/16 06:15:01 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/16 06:15:01 INFO close: Closing all cached grpc data channels.
19/11/16 06:15:01 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 06:15:01 INFO close: Closing all cached gRPC state handlers.
19/11/16 06:15:01 INFO run: Done consuming work.
19/11/16 06:15:01 INFO main: Python sdk harness exiting.
19/11/16 06:15:01 INFO GrpcLoggingService: Logging client hanged up.
19/11/16 06:15:01 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 06:15:01 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 11970 bytes result sent to driver
19/11/16 06:15:01 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 1204 ms on localhost (executor driver) (1/1)
19/11/16 06:15:01 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/16 06:15:01 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 1.213 s
19/11/16 06:15:01 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 5.651127 s
19/11/16 06:15:01 INFO SparkPipelineRunner: Job test_windowing_1573884894.3_3fcda3ed-1ae5-4f4f-b333-a4fdc891f1c8 finished.
19/11/16 06:15:01 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/16 06:15:01 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktestQFcKQ_/job_fc1b2abd-1bf7-4062-9f1b-33cf8203b69b/MANIFEST has 0 artifact locations
19/11/16 06:15:01 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestQFcKQ_/job_fc1b2abd-1bf7-4062-9f1b-33cf8203b69b/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 421, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140254640076544)>

# Thread: <Thread(Thread-119, started daemon 140254631683840)>

# Thread: <_MainThread(MainThread, started 140255771649792)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140254614898432)>

# Thread: <Thread(Thread-125, started daemon 140254623291136)>

# Thread: <Thread(Thread-119, started daemon 140254631683840)>

# Thread: <_MainThread(MainThread, started 140255771649792)>

# Thread: <Thread(wait_until_finish_read, started daemon 140254640076544)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 421, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 431, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1573884882.62_20498edd-616d-4264-96e3-608f0f9a3b5c failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 356.895s

FAILED (errors=3, skipped=9)
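
Note that every hung test bottoms out in the same three grpc frames (wait -> _wait_once -> wait_fn(timeout=timeout)): grpc waits on the response condition in short slices rather than blocking indefinitely, which is what gives the watchdog's signal a chance to be handled between waits. An illustrative reconstruction of that shape, inferred from the frames above and not grpc's actual source (the 0.1-second slice length is an assumption):

    import threading

    MAXIMUM_WAIT_TIMEOUT = 0.1  # assumed slice length, not grpc's real constant

    def _wait_once(wait_fn, timeout, spin_cb):
        wait_fn(timeout=timeout)      # typically condition.wait(timeout=...)
        if spin_cb is not None:
            spin_cb()

    def wait(wait_fn, response_ready, spin_cb=None):
        # Wake up every slice instead of blocking forever, so pending signals
        # (e.g. the test watchdog's SIGALRM) get processed between waits.
        while not response_ready():
            _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)

    # Terminating demo: a timer eventually satisfies the predicate. In the
    # failed tests above, the response never arrives, so this loop spins
    # until the 60-second watchdog raises BaseException.
    done = []
    condition = threading.Condition()
    def _finish():
        with condition:
            done.append(True)
            condition.notify()
    threading.Timer(0.3, _finish).start()
    with condition:
        wait(condition.wait, response_ready=lambda: bool(done))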

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 44s
59 actionable tasks: 47 executed, 12 from cache

Publishing build scan...
https://gradle.com/s/vh2ez5r3xl6zk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1553

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1553/display/redirect?page=changes>

Changes:

[github] [BEAM-8661] Moving runners to have per-module logger (#10097)


------------------------------------------
[...truncated 1.67 MB...]
19/11/16 02:39:41 INFO DAGScheduler: waiting: Set(ShuffleMapStage 132, ResultStage 133)
19/11/16 02:39:41 INFO DAGScheduler: failed: Set()
19/11/16 02:39:41 INFO DAGScheduler: Submitting ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115), which has no missing parents
19/11/16 02:39:41 INFO MemoryStore: Block broadcast_129 stored as values in memory (estimated size 57.2 KB, free 13.5 GB)
19/11/16 02:39:41 INFO MemoryStore: Block broadcast_129_piece0 stored as bytes in memory (estimated size 22.0 KB, free 13.5 GB)
19/11/16 02:39:41 INFO BlockManagerInfo: Added broadcast_129_piece0 in memory on localhost:44597 (size: 22.0 KB, free: 13.5 GB)
19/11/16 02:39:41 INFO SparkContext: Created broadcast 129 from broadcast at DAGScheduler.scala:1161
19/11/16 02:39:41 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115) (first 15 tasks are for partitions Vector(0, 1))
19/11/16 02:39:41 INFO TaskSchedulerImpl: Adding task set 132.0 with 2 tasks
19/11/16 02:39:41 INFO TaskSetManager: Starting task 0.0 in stage 132.0 (TID 159, localhost, executor driver, partition 0, NODE_LOCAL, 7760 bytes)
19/11/16 02:39:41 INFO Executor: Running task 0.0 in stage 132.0 (TID 159)
19/11/16 02:39:41 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/16 02:39:41 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 1 ms
19/11/16 02:39:41 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest2WZQzP/job_7e484210-1f83-4f4a-866b-d2459ff79402/MANIFEST
19/11/16 02:39:41 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest2WZQzP/job_7e484210-1f83-4f4a-866b-d2459ff79402/MANIFEST -> 0 artifacts
19/11/16 02:39:41 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/16 02:39:41 INFO main: Logging handler created.
19/11/16 02:39:41 INFO start: Status HTTP server running at localhost:44843
19/11/16 02:39:41 INFO main: semi_persistent_directory: /tmp
19/11/16 02:39:41 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/16 02:39:41 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573871978.29_5dfa2ad2-d235-4556-9917-a7686fc25c00', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/16 02:39:41 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573871978.29', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44167'}
19/11/16 02:39:41 INFO __init__: Creating state cache with size 0
19/11/16 02:39:41 INFO __init__: Creating insecure control channel for localhost:40895.
19/11/16 02:39:41 INFO __init__: Control channel established.
19/11/16 02:39:41 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/16 02:39:41 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/16 02:39:41 INFO create_state_handler: Creating insecure state channel for localhost:36499.
19/11/16 02:39:41 INFO create_state_handler: State channel established.
19/11/16 02:39:41 INFO create_data_channel: Creating client data channel for localhost:46501
19/11/16 02:39:41 INFO GrpcDataService: Beam Fn Data client connected.
19/11/16 02:39:42 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/16 02:39:42 INFO run: No more requests from control plane
19/11/16 02:39:42 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/16 02:39:42 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 02:39:42 INFO close: Closing all cached grpc data channels.
19/11/16 02:39:42 INFO close: Closing all cached gRPC state handlers.
19/11/16 02:39:42 INFO run: Done consuming work.
19/11/16 02:39:42 INFO main: Python sdk harness exiting.
19/11/16 02:39:42 INFO GrpcLoggingService: Logging client hanged up.
19/11/16 02:39:42 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 02:39:42 INFO Executor: Finished task 0.0 in stage 132.0 (TID 159). 15229 bytes result sent to driver
19/11/16 02:39:42 INFO TaskSetManager: Starting task 1.0 in stage 132.0 (TID 160, localhost, executor driver, partition 1, PROCESS_LOCAL, 7977 bytes)
19/11/16 02:39:42 INFO Executor: Running task 1.0 in stage 132.0 (TID 160)
19/11/16 02:39:42 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 159) in 950 ms on localhost (executor driver) (1/2)
19/11/16 02:39:42 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest2WZQzP/job_7e484210-1f83-4f4a-866b-d2459ff79402/MANIFEST
19/11/16 02:39:42 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest2WZQzP/job_7e484210-1f83-4f4a-866b-d2459ff79402/MANIFEST -> 0 artifacts
19/11/16 02:39:42 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/16 02:39:42 INFO main: Logging handler created.
19/11/16 02:39:42 INFO start: Status HTTP server running at localhost:44451
19/11/16 02:39:42 INFO main: semi_persistent_directory: /tmp
19/11/16 02:39:42 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/16 02:39:42 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573871978.29_5dfa2ad2-d235-4556-9917-a7686fc25c00', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/16 02:39:42 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573871978.29', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44167'}
19/11/16 02:39:42 INFO __init__: Creating state cache with size 0
19/11/16 02:39:42 INFO __init__: Creating insecure control channel for localhost:46033.
19/11/16 02:39:42 INFO __init__: Control channel established.
19/11/16 02:39:42 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/16 02:39:42 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/16 02:39:42 INFO create_state_handler: Creating insecure state channel for localhost:44933.
19/11/16 02:39:42 INFO create_state_handler: State channel established.
19/11/16 02:39:42 INFO create_data_channel: Creating client data channel for localhost:36211
19/11/16 02:39:42 INFO GrpcDataService: Beam Fn Data client connected.
19/11/16 02:39:42 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/16 02:39:42 INFO run: No more requests from control plane
19/11/16 02:39:42 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/16 02:39:42 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 02:39:42 INFO close: Closing all cached grpc data channels.
19/11/16 02:39:42 INFO close: Closing all cached gRPC state handlers.
19/11/16 02:39:42 INFO run: Done consuming work.
19/11/16 02:39:42 INFO main: Python sdk harness exiting.
19/11/16 02:39:42 INFO GrpcLoggingService: Logging client hanged up.
19/11/16 02:39:42 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 02:39:42 INFO Executor: Finished task 1.0 in stage 132.0 (TID 160). 13710 bytes result sent to driver
19/11/16 02:39:42 INFO TaskSetManager: Finished task 1.0 in stage 132.0 (TID 160) in 870 ms on localhost (executor driver) (2/2)
19/11/16 02:39:42 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/16 02:39:42 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 1.827 s
19/11/16 02:39:42 INFO DAGScheduler: looking for newly runnable stages
19/11/16 02:39:42 INFO DAGScheduler: running: Set()
19/11/16 02:39:42 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/16 02:39:42 INFO DAGScheduler: failed: Set()
19/11/16 02:39:42 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/16 02:39:42 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.5 GB)
19/11/16 02:39:42 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.3 KB, free 13.5 GB)
19/11/16 02:39:42 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:44597 (size: 12.3 KB, free: 13.5 GB)
19/11/16 02:39:42 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/16 02:39:43 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/16 02:39:43 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/16 02:39:43 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/16 02:39:43 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/16 02:39:43 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/16 02:39:43 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/16 02:39:43 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest2WZQzP/job_7e484210-1f83-4f4a-866b-d2459ff79402/MANIFEST
19/11/16 02:39:43 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest2WZQzP/job_7e484210-1f83-4f4a-866b-d2459ff79402/MANIFEST -> 0 artifacts
19/11/16 02:39:43 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/16 02:39:43 INFO main: Logging handler created.
19/11/16 02:39:43 INFO start: Status HTTP server running at localhost:43991
19/11/16 02:39:43 INFO main: semi_persistent_directory: /tmp
19/11/16 02:39:43 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/16 02:39:43 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573871978.29_5dfa2ad2-d235-4556-9917-a7686fc25c00', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/16 02:39:43 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573871978.29', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44167'}
19/11/16 02:39:43 INFO __init__: Creating state cache with size 0
19/11/16 02:39:43 INFO __init__: Creating insecure control channel for localhost:40971.
19/11/16 02:39:43 INFO __init__: Control channel established.
19/11/16 02:39:43 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/16 02:39:43 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/16 02:39:43 INFO create_state_handler: Creating insecure state channel for localhost:39423.
19/11/16 02:39:43 INFO create_state_handler: State channel established.
19/11/16 02:39:43 INFO create_data_channel: Creating client data channel for localhost:36427
19/11/16 02:39:43 INFO GrpcDataService: Beam Fn Data client connected.
19/11/16 02:39:43 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/16 02:39:43 INFO run: No more requests from control plane
19/11/16 02:39:43 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/16 02:39:43 INFO close: Closing all cached grpc data channels.
19/11/16 02:39:43 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 02:39:43 INFO close: Closing all cached gRPC state handlers.
19/11/16 02:39:43 INFO run: Done consuming work.
19/11/16 02:39:43 INFO main: Python sdk harness exiting.
19/11/16 02:39:43 INFO GrpcLoggingService: Logging client hanged up.
19/11/16 02:39:43 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 02:39:43 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 11970 bytes result sent to driver
19/11/16 02:39:43 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 870 ms on localhost (executor driver) (1/1)
19/11/16 02:39:43 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/16 02:39:43 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 0.877 s
19/11/16 02:39:43 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 4.513910 s
19/11/16 02:39:43 INFO SparkPipelineRunner: Job test_windowing_1573871978.29_5dfa2ad2-d235-4556-9917-a7686fc25c00 finished.
19/11/16 02:39:43 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/16 02:39:43 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktest2WZQzP/job_7e484210-1f83-4f4a-866b-d2459ff79402/MANIFEST has 0 artifact locations
19/11/16 02:39:43 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktest2WZQzP/job_7e484210-1f83-4f4a-866b-d2459ff79402/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 421, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 421, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140111536723712)>

# Thread: <Thread(Thread-117, started daemon 140111528331008)>

# Thread: <_MainThread(MainThread, started 140112323286784)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140111509448448)>

# Thread: <Thread(Thread-123, started daemon 140111517841152)>

# Thread: <_MainThread(MainThread, started 140112323286784)>

# Thread: <Thread(Thread-117, started daemon 140111528331008)>

# Thread: <Thread(wait_until_finish_read, started daemon 140111536723712)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 431, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1573871968.79_92c4ed58-8711-4e73-bb43-cfb0fddea25d failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 311.369s

FAILED (errors=3, skipped=9)
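
All three errors surface through pipeline.py line 436 (__exit__) because assert_that only adds a checking transform to the pipeline graph; nothing executes until the `with` block exits, at which point run().wait_until_finish() blocks on the job. A minimal self-contained example of that pattern (run on the default runner here, whereas the job above used the portable Spark runner):

    import apache_beam as beam
    from apache_beam.testing.util import assert_that, equal_to

    # Leaving the `with` block calls run().wait_until_finish(), so a hung or
    # failed run is raised at the end of the block, not at assert_that itself.
    with beam.Pipeline() as p:
        actual = p | beam.Create(['a', 'b', 'c'])
        assert_that(actual, equal_to(['a', 'b', 'c']))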

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 42s
59 actionable tasks: 46 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/rl65zv6d5zzxi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1552

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1552/display/redirect?page=changes>

Changes:

[thw] [BEAM-8670] Manage environment parallelism in DefaultJobBundleFactory


------------------------------------------
[...truncated 1.65 MB...]
19/11/16 01:30:24 INFO DAGScheduler: waiting: Set(ShuffleMapStage 132, ResultStage 133)
19/11/16 01:30:24 INFO DAGScheduler: failed: Set()
19/11/16 01:30:24 INFO DAGScheduler: Submitting ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115), which has no missing parents
19/11/16 01:30:24 INFO MemoryStore: Block broadcast_129 stored as values in memory (estimated size 57.2 KB, free 13.5 GB)
19/11/16 01:30:24 INFO MemoryStore: Block broadcast_129_piece0 stored as bytes in memory (estimated size 22.9 KB, free 13.5 GB)
19/11/16 01:30:24 INFO BlockManagerInfo: Added broadcast_129_piece0 in memory on localhost:34037 (size: 22.9 KB, free: 13.5 GB)
19/11/16 01:30:24 INFO SparkContext: Created broadcast 129 from broadcast at DAGScheduler.scala:1161
19/11/16 01:30:24 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115) (first 15 tasks are for partitions Vector(0, 1))
19/11/16 01:30:24 INFO TaskSchedulerImpl: Adding task set 132.0 with 2 tasks
19/11/16 01:30:24 INFO TaskSetManager: Starting task 1.0 in stage 132.0 (TID 159, localhost, executor driver, partition 1, NODE_LOCAL, 7760 bytes)
19/11/16 01:30:24 INFO Executor: Running task 1.0 in stage 132.0 (TID 159)
19/11/16 01:30:24 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/16 01:30:24 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/16 01:30:24 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestfMusLv/job_06acddf1-3f88-4f92-b7f5-9cdcebcd1411/MANIFEST
19/11/16 01:30:24 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestfMusLv/job_06acddf1-3f88-4f92-b7f5-9cdcebcd1411/MANIFEST -> 0 artifacts
19/11/16 01:30:25 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/16 01:30:25 INFO main: Logging handler created.
19/11/16 01:30:25 INFO start: Status HTTP server running at localhost:45975
19/11/16 01:30:25 INFO main: semi_persistent_directory: /tmp
19/11/16 01:30:25 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/16 01:30:25 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573867821.76_d29add25-6bc9-4aa8-a01d-c144c395cf40', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/16 01:30:25 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573867821.76', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:60259'}
19/11/16 01:30:25 INFO __init__: Creating state cache with size 0
19/11/16 01:30:25 INFO __init__: Creating insecure control channel for localhost:35985.
19/11/16 01:30:25 INFO __init__: Control channel established.
19/11/16 01:30:25 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/16 01:30:25 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/16 01:30:25 INFO create_state_handler: Creating insecure state channel for localhost:36511.
19/11/16 01:30:25 INFO create_state_handler: State channel established.
19/11/16 01:30:25 INFO create_data_channel: Creating client data channel for localhost:42531
19/11/16 01:30:25 INFO GrpcDataService: Beam Fn Data client connected.
19/11/16 01:30:25 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/16 01:30:25 INFO run: No more requests from control plane
19/11/16 01:30:25 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/16 01:30:25 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 01:30:25 INFO close: Closing all cached grpc data channels.
19/11/16 01:30:25 INFO close: Closing all cached gRPC state handlers.
19/11/16 01:30:25 INFO run: Done consuming work.
19/11/16 01:30:25 INFO main: Python sdk harness exiting.
19/11/16 01:30:25 INFO GrpcLoggingService: Logging client hanged up.
19/11/16 01:30:25 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 01:30:25 INFO Executor: Finished task 1.0 in stage 132.0 (TID 159). 15229 bytes result sent to driver
19/11/16 01:30:25 INFO TaskSetManager: Starting task 0.0 in stage 132.0 (TID 160, localhost, executor driver, partition 0, PROCESS_LOCAL, 7977 bytes)
19/11/16 01:30:25 INFO Executor: Running task 0.0 in stage 132.0 (TID 160)
19/11/16 01:30:25 INFO TaskSetManager: Finished task 1.0 in stage 132.0 (TID 159) in 890 ms on localhost (executor driver) (1/2)
19/11/16 01:30:25 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestfMusLv/job_06acddf1-3f88-4f92-b7f5-9cdcebcd1411/MANIFEST
19/11/16 01:30:25 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestfMusLv/job_06acddf1-3f88-4f92-b7f5-9cdcebcd1411/MANIFEST -> 0 artifacts
19/11/16 01:30:26 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/16 01:30:26 INFO main: Logging handler created.
19/11/16 01:30:26 INFO start: Status HTTP server running at localhost:36303
19/11/16 01:30:26 INFO main: semi_persistent_directory: /tmp
19/11/16 01:30:26 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/16 01:30:26 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573867821.76_d29add25-6bc9-4aa8-a01d-c144c395cf40', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/16 01:30:26 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573867821.76', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:60259'}
19/11/16 01:30:26 INFO __init__: Creating state cache with size 0
19/11/16 01:30:26 INFO __init__: Creating insecure control channel for localhost:32891.
19/11/16 01:30:26 INFO __init__: Control channel established.
19/11/16 01:30:26 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/16 01:30:26 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/16 01:30:26 INFO create_state_handler: Creating insecure state channel for localhost:40155.
19/11/16 01:30:26 INFO create_state_handler: State channel established.
19/11/16 01:30:26 INFO create_data_channel: Creating client data channel for localhost:41895
19/11/16 01:30:26 INFO GrpcDataService: Beam Fn Data client connected.
19/11/16 01:30:26 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/16 01:30:26 INFO run: No more requests from control plane
19/11/16 01:30:26 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/16 01:30:26 INFO close: Closing all cached grpc data channels.
19/11/16 01:30:26 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 01:30:26 INFO close: Closing all cached gRPC state handlers.
19/11/16 01:30:26 INFO run: Done consuming work.
19/11/16 01:30:26 INFO main: Python sdk harness exiting.
19/11/16 01:30:26 INFO GrpcLoggingService: Logging client hanged up.
19/11/16 01:30:26 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 01:30:26 INFO Executor: Finished task 0.0 in stage 132.0 (TID 160). 13710 bytes result sent to driver
19/11/16 01:30:26 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 160) in 831 ms on localhost (executor driver) (2/2)
19/11/16 01:30:26 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/16 01:30:26 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 1.727 s
19/11/16 01:30:26 INFO DAGScheduler: looking for newly runnable stages
19/11/16 01:30:26 INFO DAGScheduler: running: Set()
19/11/16 01:30:26 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/16 01:30:26 INFO DAGScheduler: failed: Set()
19/11/16 01:30:26 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/16 01:30:26 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.5 GB)
19/11/16 01:30:26 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.3 KB, free 13.5 GB)
19/11/16 01:30:26 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:34037 (size: 12.3 KB, free: 13.5 GB)
19/11/16 01:30:26 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/16 01:30:26 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/16 01:30:26 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/16 01:30:26 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/16 01:30:26 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/16 01:30:26 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/16 01:30:26 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 1 ms
19/11/16 01:30:26 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestfMusLv/job_06acddf1-3f88-4f92-b7f5-9cdcebcd1411/MANIFEST
19/11/16 01:30:26 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestfMusLv/job_06acddf1-3f88-4f92-b7f5-9cdcebcd1411/MANIFEST -> 0 artifacts
19/11/16 01:30:26 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/16 01:30:26 INFO main: Logging handler created.
19/11/16 01:30:26 INFO start: Status HTTP server running at localhost:35669
19/11/16 01:30:26 INFO main: semi_persistent_directory: /tmp
19/11/16 01:30:26 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/16 01:30:26 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573867821.76_d29add25-6bc9-4aa8-a01d-c144c395cf40', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/16 01:30:26 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573867821.76', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:60259'}
19/11/16 01:30:26 INFO __init__: Creating state cache with size 0
19/11/16 01:30:26 INFO __init__: Creating insecure control channel for localhost:43615.
19/11/16 01:30:26 INFO __init__: Control channel established.
19/11/16 01:30:26 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/16 01:30:26 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/16 01:30:26 INFO create_state_handler: Creating insecure state channel for localhost:40409.
19/11/16 01:30:26 INFO create_state_handler: State channel established.
19/11/16 01:30:26 INFO create_data_channel: Creating client data channel for localhost:38183
19/11/16 01:30:26 INFO GrpcDataService: Beam Fn Data client connected.
19/11/16 01:30:26 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/16 01:30:26 INFO run: No more requests from control plane
19/11/16 01:30:26 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/16 01:30:26 INFO close: Closing all cached grpc data channels.
19/11/16 01:30:26 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 01:30:26 INFO close: Closing all cached gRPC state handlers.
19/11/16 01:30:26 INFO run: Done consuming work.
19/11/16 01:30:26 INFO main: Python sdk harness exiting.
19/11/16 01:30:26 INFO GrpcLoggingService: Logging client hanged up.
19/11/16 01:30:27 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 01:30:27 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 11970 bytes result sent to driver
19/11/16 01:30:27 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 850 ms on localhost (executor driver) (1/1)
19/11/16 01:30:27 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/16 01:30:27 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 0.856 s
19/11/16 01:30:27 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 4.275877 s
19/11/16 01:30:27 INFO SparkPipelineRunner: Job test_windowing_1573867821.76_d29add25-6bc9-4aa8-a01d-c144c395cf40 finished.
19/11/16 01:30:27 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/16 01:30:27 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktestfMusLv/job_06acddf1-3f88-4f92-b7f5-9cdcebcd1411/MANIFEST has 0 artifact locations
19/11/16 01:30:27 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestfMusLv/job_06acddf1-3f88-4f92-b7f5-9cdcebcd1411/
INFO:root:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 229, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 419, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 73, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
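
The "BaseException: Timed out after 60 seconds." above, and the "==================== Timed out after 60 seconds. ====================" banners with "# Thread: ..." dumps that follow, both come from a watchdog in the test harness (the handler frame at portable_runner_test.py line 73 in the traceback). A minimal sketch of that pattern, assuming a Unix SIGALRM timer and simplified names rather than Beam's exact implementation:

    # Sketch of the watchdog pattern implied by the traceback; names simplified.
    import signal
    import sys
    import threading
    import traceback

    TIMEOUT_SECS = 60  # matches the 60-second timeout seen in the tracebacks

    def handler(signum, frame):
        # Runs in the main thread when the alarm fires: print a banner plus
        # one "# Thread: ..." line per live thread with its stack, then raise
        # BaseException so a broad "except Exception" in the blocked test
        # cannot swallow the timeout.
        msg = 'Timed out after %s seconds.' % TIMEOUT_SECS
        print('==================== %s ====================' % msg)
        frames = sys._current_frames()
        for thread in threading.enumerate():
            print('# Thread: %s' % thread)
            traceback.print_stack(frames.get(thread.ident))
        raise BaseException(msg)

    signal.signal(signal.SIGALRM, handler)
    signal.alarm(TIMEOUT_SECS)  # re-armed per test; signal.alarm(0) cancels

Because the watchdog writes to the same console as unittest, its banners and thread dumps interleave with the error reports in the raw log.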

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 326, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 419, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 73, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140646320998144)>

# Thread: <Thread(Thread-120, started daemon 140646304212736)>

# Thread: <_MainThread(MainThread, started 140647100229376)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140646220297984)>

# Thread: <Thread(Thread-126, started daemon 140646211905280)>

# Thread: <Thread(Thread-120, started daemon 140646304212736)>

# Thread: <Thread(wait_until_finish_read, started daemon 140646320998144)>

# Thread: <_MainThread(MainThread, started 140647100229376)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 497, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 429, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1573867812.54_4d152cfa-26b3-4c50-b95d-824d280fa7e3 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 298.148s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 25s
59 actionable tasks: 47 executed, 12 from cache

Publishing build scan...
https://gradle.com/s/glztu6quupibs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1551

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1551/display/redirect>

Changes:


------------------------------------------
[...truncated 1.66 MB...]
19/11/16 00:34:25 INFO DAGScheduler: waiting: Set(ShuffleMapStage 132, ResultStage 133)
19/11/16 00:34:25 INFO DAGScheduler: failed: Set()
19/11/16 00:34:25 INFO DAGScheduler: Submitting ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115), which has no missing parents
19/11/16 00:34:25 INFO MemoryStore: Block broadcast_129 stored as values in memory (estimated size 57.2 KB, free 13.5 GB)
19/11/16 00:34:25 INFO MemoryStore: Block broadcast_129_piece0 stored as bytes in memory (estimated size 22.9 KB, free 13.5 GB)
19/11/16 00:34:25 INFO BlockManagerInfo: Added broadcast_129_piece0 in memory on localhost:35093 (size: 22.9 KB, free: 13.5 GB)
19/11/16 00:34:25 INFO SparkContext: Created broadcast 129 from broadcast at DAGScheduler.scala:1161
19/11/16 00:34:25 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115) (first 15 tasks are for partitions Vector(0, 1))
19/11/16 00:34:25 INFO TaskSchedulerImpl: Adding task set 132.0 with 2 tasks
19/11/16 00:34:25 INFO TaskSetManager: Starting task 1.0 in stage 132.0 (TID 159, localhost, executor driver, partition 1, NODE_LOCAL, 7760 bytes)
19/11/16 00:34:25 INFO Executor: Running task 1.0 in stage 132.0 (TID 159)
19/11/16 00:34:25 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/16 00:34:25 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/16 00:34:25 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestOoHZTc/job_282c8710-8703-4196-b26c-e3b05cb795db/MANIFEST
19/11/16 00:34:25 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestOoHZTc/job_282c8710-8703-4196-b26c-e3b05cb795db/MANIFEST -> 0 artifacts
19/11/16 00:34:26 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/16 00:34:26 INFO main: Logging handler created.
19/11/16 00:34:26 INFO start: Status HTTP server running at localhost:41725
19/11/16 00:34:26 INFO main: semi_persistent_directory: /tmp
19/11/16 00:34:26 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/16 00:34:26 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573864462.93_1f1ff1c4-dd66-4fc9-9c04-983c18d54b06', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/16 00:34:26 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573864462.93', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56979'}
19/11/16 00:34:26 INFO __init__: Creating state cache with size 0
19/11/16 00:34:26 INFO __init__: Creating insecure control channel for localhost:34585.
19/11/16 00:34:26 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/16 00:34:26 INFO __init__: Control channel established.
19/11/16 00:34:26 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/16 00:34:26 INFO create_state_handler: Creating insecure state channel for localhost:40411.
19/11/16 00:34:26 INFO create_state_handler: State channel established.
19/11/16 00:34:26 INFO create_data_channel: Creating client data channel for localhost:36105
19/11/16 00:34:26 INFO GrpcDataService: Beam Fn Data client connected.
19/11/16 00:34:26 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/16 00:34:26 INFO run: No more requests from control plane
19/11/16 00:34:26 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/16 00:34:26 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 00:34:26 INFO close: Closing all cached grpc data channels.
19/11/16 00:34:26 INFO close: Closing all cached gRPC state handlers.
19/11/16 00:34:26 INFO run: Done consuming work.
19/11/16 00:34:26 INFO main: Python sdk harness exiting.
19/11/16 00:34:26 INFO GrpcLoggingService: Logging client hanged up.
19/11/16 00:34:26 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 00:34:26 INFO Executor: Finished task 1.0 in stage 132.0 (TID 159). 15229 bytes result sent to driver
19/11/16 00:34:26 INFO TaskSetManager: Starting task 0.0 in stage 132.0 (TID 160, localhost, executor driver, partition 0, PROCESS_LOCAL, 7977 bytes)
19/11/16 00:34:26 INFO Executor: Running task 0.0 in stage 132.0 (TID 160)
19/11/16 00:34:26 INFO TaskSetManager: Finished task 1.0 in stage 132.0 (TID 159) in 931 ms on localhost (executor driver) (1/2)
19/11/16 00:34:26 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestOoHZTc/job_282c8710-8703-4196-b26c-e3b05cb795db/MANIFEST
19/11/16 00:34:26 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestOoHZTc/job_282c8710-8703-4196-b26c-e3b05cb795db/MANIFEST -> 0 artifacts
19/11/16 00:34:27 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/16 00:34:27 INFO main: Logging handler created.
19/11/16 00:34:27 INFO start: Status HTTP server running at localhost:35143
19/11/16 00:34:27 INFO main: semi_persistent_directory: /tmp
19/11/16 00:34:27 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/16 00:34:27 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573864462.93_1f1ff1c4-dd66-4fc9-9c04-983c18d54b06', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/16 00:34:27 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573864462.93', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56979'}
19/11/16 00:34:27 INFO __init__: Creating state cache with size 0
19/11/16 00:34:27 INFO __init__: Creating insecure control channel for localhost:46865.
19/11/16 00:34:27 INFO __init__: Control channel established.
19/11/16 00:34:27 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/16 00:34:27 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/16 00:34:27 INFO create_state_handler: Creating insecure state channel for localhost:34959.
19/11/16 00:34:27 INFO create_state_handler: State channel established.
19/11/16 00:34:27 INFO create_data_channel: Creating client data channel for localhost:34133
19/11/16 00:34:27 INFO GrpcDataService: Beam Fn Data client connected.
19/11/16 00:34:27 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/16 00:34:27 INFO run: No more requests from control plane
19/11/16 00:34:27 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/16 00:34:27 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 00:34:27 INFO close: Closing all cached grpc data channels.
19/11/16 00:34:27 INFO close: Closing all cached gRPC state handlers.
19/11/16 00:34:27 INFO run: Done consuming work.
19/11/16 00:34:27 INFO main: Python sdk harness exiting.
19/11/16 00:34:27 INFO GrpcLoggingService: Logging client hanged up.
19/11/16 00:34:27 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 00:34:27 INFO Executor: Finished task 0.0 in stage 132.0 (TID 160). 13710 bytes result sent to driver
19/11/16 00:34:27 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 160) in 1040 ms on localhost (executor driver) (2/2)
19/11/16 00:34:27 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/16 00:34:27 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 1.977 s
19/11/16 00:34:27 INFO DAGScheduler: looking for newly runnable stages
19/11/16 00:34:27 INFO DAGScheduler: running: Set()
19/11/16 00:34:27 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/16 00:34:27 INFO DAGScheduler: failed: Set()
19/11/16 00:34:27 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/16 00:34:27 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.5 GB)
19/11/16 00:34:27 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.4 KB, free 13.5 GB)
19/11/16 00:34:27 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:35093 (size: 12.4 KB, free: 13.5 GB)
19/11/16 00:34:27 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/16 00:34:27 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/16 00:34:27 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/16 00:34:27 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/16 00:34:27 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/16 00:34:27 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/16 00:34:27 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/16 00:34:27 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestOoHZTc/job_282c8710-8703-4196-b26c-e3b05cb795db/MANIFEST
19/11/16 00:34:27 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestOoHZTc/job_282c8710-8703-4196-b26c-e3b05cb795db/MANIFEST -> 0 artifacts
19/11/16 00:34:28 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/16 00:34:28 INFO main: Logging handler created.
19/11/16 00:34:28 INFO main: semi_persistent_directory: /tmp
19/11/16 00:34:28 INFO start: Status HTTP server running at localhost:42437
19/11/16 00:34:28 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/16 00:34:28 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573864462.93_1f1ff1c4-dd66-4fc9-9c04-983c18d54b06', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/16 00:34:28 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573864462.93', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56979'}
19/11/16 00:34:28 INFO __init__: Creating state cache with size 0
19/11/16 00:34:28 INFO __init__: Creating insecure control channel for localhost:35487.
19/11/16 00:34:28 INFO __init__: Control channel established.
19/11/16 00:34:28 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/16 00:34:28 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/16 00:34:28 INFO create_state_handler: Creating insecure state channel for localhost:37791.
19/11/16 00:34:28 INFO create_state_handler: State channel established.
19/11/16 00:34:28 INFO create_data_channel: Creating client data channel for localhost:36053
19/11/16 00:34:28 INFO GrpcDataService: Beam Fn Data client connected.
19/11/16 00:34:28 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/16 00:34:28 INFO run: No more requests from control plane
19/11/16 00:34:28 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/16 00:34:28 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 00:34:28 INFO close: Closing all cached grpc data channels.
19/11/16 00:34:28 INFO close: Closing all cached gRPC state handlers.
19/11/16 00:34:28 INFO run: Done consuming work.
19/11/16 00:34:28 INFO main: Python sdk harness exiting.
19/11/16 00:34:28 INFO GrpcLoggingService: Logging client hanged up.
19/11/16 00:34:28 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 00:34:28 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 11970 bytes result sent to driver
19/11/16 00:34:28 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 964 ms on localhost (executor driver) (1/1)
19/11/16 00:34:28 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/16 00:34:28 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 0.972 s
19/11/16 00:34:28 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 4.743844 s
19/11/16 00:34:28 INFO SparkPipelineRunner: Job test_windowing_1573864462.93_1f1ff1c4-dd66-4fc9-9c04-983c18d54b06 finished.
19/11/16 00:34:28 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/16 00:34:28 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktestOoHZTc/job_282c8710-8703-4196-b26c-e3b05cb795db/MANIFEST has 0 artifact locations
19/11/16 00:34:28 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestOoHZTc/job_282c8710-8703-4196-b26c-e3b05cb795db/
INFO:root:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 229, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 419, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 73, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
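
These recurring 60-second timeouts all block at the same spot: wait_until_finish iterating self._state_stream, a server-streaming gRPC call that yields only when the job service emits a state update, so a hung job leaves the test sleeping inside grpc._channel until the watchdog fires. A small stand-in using only grpcio (hypothetical port; not Beam's actual job-service stubs) showing how a client-side deadline bounds such a wait:

    import grpc

    # Stand-in for the blocking iteration over the state stream seen in the
    # tracebacks; nothing listens on this hypothetical port, so the wait
    # times out instead of hanging.
    channel = grpc.insecure_channel('localhost:59999')
    try:
        grpc.channel_ready_future(channel).result(timeout=2)
    except grpc.FutureTimeoutError:
        print('timed out waiting for the job service (bounded wait)')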

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139833378010880)>

# Thread: <Thread(Thread-120, started daemon 139833386403584)>

# Thread: <_MainThread(MainThread, started 139834165843712)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139833360439040)>

# Thread: <Thread(Thread-126, started daemon 139833352046336)>

# Thread: <_MainThread(MainThread, started 139834165843712)>

# Thread: <Thread(wait_until_finish_read, started daemon 139833378010880)>

# Thread: <Thread(Thread-120, started daemon 139833386403584)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 326, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 419, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 73, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 497, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 429, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1573864453.27_0c6b5707-0d18-4f69-87fb-1fc2290418f9 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 327.764s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 16s
59 actionable tasks: 47 executed, 12 from cache

Publishing build scan...
https://gradle.com/s/uegb6d3gnfcog

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1550

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1550/display/redirect?page=changes>

Changes:

[altay] Separate pydocs generation from py2 precommit tests.

[altay] Add settings file

[robertwb] Merge pull request #10117 [BEAM-8335] Add service and tagged output


------------------------------------------
[...truncated 1.66 MB...]
19/11/16 00:01:39 INFO main: Logging handler created.
19/11/16 00:01:39 INFO start: Status HTTP server running at localhost:40713
19/11/16 00:01:39 INFO main: semi_persistent_directory: /tmp
19/11/16 00:01:39 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/16 00:01:39 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573862496.64_f5872907-68c3-451a-a822-a577584d5ff2', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/16 00:01:39 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573862496.64', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:51711'}
19/11/16 00:01:39 INFO __init__: Creating state cache with size 0
19/11/16 00:01:39 INFO __init__: Creating insecure control channel for localhost:37591.
19/11/16 00:01:39 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/16 00:01:39 INFO __init__: Control channel established.
19/11/16 00:01:39 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/16 00:01:39 INFO create_state_handler: Creating insecure state channel for localhost:41223.
19/11/16 00:01:39 INFO create_state_handler: State channel established.
19/11/16 00:01:39 INFO create_data_channel: Creating client data channel for localhost:46061
19/11/16 00:01:39 INFO GrpcDataService: Beam Fn Data client connected.
19/11/16 00:01:39 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/16 00:01:39 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/16 00:01:39 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/16 00:01:39 INFO run: No more requests from control plane
19/11/16 00:01:39 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/16 00:01:39 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 00:01:39 INFO close: Closing all cached grpc data channels.
19/11/16 00:01:39 INFO close: Closing all cached gRPC state handlers.
19/11/16 00:01:39 INFO run: Done consuming work.
19/11/16 00:01:39 INFO main: Python sdk harness exiting.
19/11/16 00:01:39 INFO GrpcLoggingService: Logging client hanged up.
19/11/16 00:01:39 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 00:01:39 INFO Executor: Finished task 0.0 in stage 131.0 (TID 158). 12763 bytes result sent to driver
19/11/16 00:01:39 INFO TaskSetManager: Finished task 0.0 in stage 131.0 (TID 158) in 842 ms on localhost (executor driver) (1/1)
19/11/16 00:01:39 INFO TaskSchedulerImpl: Removed TaskSet 131.0, whose tasks have all completed, from pool 
19/11/16 00:01:39 INFO DAGScheduler: ShuffleMapStage 131 (mapToPair at GroupCombineFunctions.java:55) finished in 0.847 s
19/11/16 00:01:39 INFO DAGScheduler: looking for newly runnable stages
19/11/16 00:01:39 INFO DAGScheduler: running: Set()
19/11/16 00:01:39 INFO DAGScheduler: waiting: Set(ShuffleMapStage 132, ResultStage 133)
19/11/16 00:01:39 INFO DAGScheduler: failed: Set()
19/11/16 00:01:39 INFO DAGScheduler: Submitting ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115), which has no missing parents
19/11/16 00:01:39 INFO MemoryStore: Block broadcast_129 stored as values in memory (estimated size 57.2 KB, free 13.5 GB)
19/11/16 00:01:39 INFO MemoryStore: Block broadcast_129_piece0 stored as bytes in memory (estimated size 22.0 KB, free 13.5 GB)
19/11/16 00:01:39 INFO BlockManagerInfo: Added broadcast_129_piece0 in memory on localhost:43921 (size: 22.0 KB, free: 13.5 GB)
19/11/16 00:01:39 INFO SparkContext: Created broadcast 129 from broadcast at DAGScheduler.scala:1161
19/11/16 00:01:39 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115) (first 15 tasks are for partitions Vector(0, 1))
19/11/16 00:01:39 INFO TaskSchedulerImpl: Adding task set 132.0 with 2 tasks
19/11/16 00:01:39 INFO TaskSetManager: Starting task 0.0 in stage 132.0 (TID 159, localhost, executor driver, partition 0, NODE_LOCAL, 7760 bytes)
19/11/16 00:01:39 INFO Executor: Running task 0.0 in stage 132.0 (TID 159)
19/11/16 00:01:39 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/16 00:01:39 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/16 00:01:39 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestxdnzCe/job_9cbc36d1-c7b9-4646-916c-c3dd82242eb7/MANIFEST
19/11/16 00:01:39 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestxdnzCe/job_9cbc36d1-c7b9-4646-916c-c3dd82242eb7/MANIFEST -> 0 artifacts
19/11/16 00:01:39 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/16 00:01:39 INFO main: Logging handler created.
19/11/16 00:01:39 INFO start: Status HTTP server running at localhost:34487
19/11/16 00:01:39 INFO main: semi_persistent_directory: /tmp
19/11/16 00:01:39 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/16 00:01:39 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573862496.64_f5872907-68c3-451a-a822-a577584d5ff2', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/16 00:01:39 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573862496.64', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:51711'}
19/11/16 00:01:39 INFO __init__: Creating state cache with size 0
19/11/16 00:01:39 INFO __init__: Creating insecure control channel for localhost:38757.
19/11/16 00:01:39 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/16 00:01:39 INFO __init__: Control channel established.
19/11/16 00:01:39 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/16 00:01:39 INFO create_state_handler: Creating insecure state channel for localhost:36207.
19/11/16 00:01:39 INFO create_state_handler: State channel established.
19/11/16 00:01:39 INFO create_data_channel: Creating client data channel for localhost:42533
19/11/16 00:01:39 INFO GrpcDataService: Beam Fn Data client connected.
19/11/16 00:01:40 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/16 00:01:40 INFO run: No more requests from control plane
19/11/16 00:01:40 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/16 00:01:40 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 00:01:40 INFO close: Closing all cached grpc data channels.
19/11/16 00:01:40 INFO close: Closing all cached gRPC state handlers.
19/11/16 00:01:40 INFO run: Done consuming work.
19/11/16 00:01:40 INFO main: Python sdk harness exiting.
19/11/16 00:01:40 INFO GrpcLoggingService: Logging client hanged up.
19/11/16 00:01:40 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 00:01:40 INFO Executor: Finished task 0.0 in stage 132.0 (TID 159). 15229 bytes result sent to driver
19/11/16 00:01:40 INFO TaskSetManager: Starting task 1.0 in stage 132.0 (TID 160, localhost, executor driver, partition 1, PROCESS_LOCAL, 7977 bytes)
19/11/16 00:01:40 INFO Executor: Running task 1.0 in stage 132.0 (TID 160)
19/11/16 00:01:40 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 159) in 888 ms on localhost (executor driver) (1/2)
19/11/16 00:01:40 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestxdnzCe/job_9cbc36d1-c7b9-4646-916c-c3dd82242eb7/MANIFEST
19/11/16 00:01:40 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestxdnzCe/job_9cbc36d1-c7b9-4646-916c-c3dd82242eb7/MANIFEST -> 0 artifacts
19/11/16 00:01:40 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/16 00:01:40 INFO main: Logging handler created.
19/11/16 00:01:40 INFO start: Status HTTP server running at localhost:44357
19/11/16 00:01:40 INFO main: semi_persistent_directory: /tmp
19/11/16 00:01:40 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/16 00:01:40 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573862496.64_f5872907-68c3-451a-a822-a577584d5ff2', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/16 00:01:40 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573862496.64', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:51711'}
19/11/16 00:01:40 INFO __init__: Creating state cache with size 0
19/11/16 00:01:40 INFO __init__: Creating insecure control channel for localhost:41463.
19/11/16 00:01:40 INFO __init__: Control channel established.
19/11/16 00:01:40 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/16 00:01:40 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/16 00:01:40 INFO create_state_handler: Creating insecure state channel for localhost:39487.
19/11/16 00:01:40 INFO create_state_handler: State channel established.
19/11/16 00:01:40 INFO create_data_channel: Creating client data channel for localhost:36301
19/11/16 00:01:40 INFO GrpcDataService: Beam Fn Data client connected.
19/11/16 00:01:40 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/16 00:01:40 INFO run: No more requests from control plane
19/11/16 00:01:40 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/16 00:01:40 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 00:01:40 INFO close: Closing all cached grpc data channels.
19/11/16 00:01:40 INFO close: Closing all cached gRPC state handlers.
19/11/16 00:01:40 INFO run: Done consuming work.
19/11/16 00:01:40 INFO main: Python sdk harness exiting.
19/11/16 00:01:40 INFO GrpcLoggingService: Logging client hanged up.
19/11/16 00:01:40 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 00:01:40 INFO Executor: Finished task 1.0 in stage 132.0 (TID 160). 13710 bytes result sent to driver
19/11/16 00:01:40 INFO TaskSetManager: Finished task 1.0 in stage 132.0 (TID 160) in 801 ms on localhost (executor driver) (2/2)
19/11/16 00:01:40 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/16 00:01:40 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 1.695 s
19/11/16 00:01:40 INFO DAGScheduler: looking for newly runnable stages
19/11/16 00:01:40 INFO DAGScheduler: running: Set()
19/11/16 00:01:40 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/16 00:01:40 INFO DAGScheduler: failed: Set()
19/11/16 00:01:40 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/16 00:01:40 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.5 GB)
19/11/16 00:01:40 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.3 KB, free 13.5 GB)
19/11/16 00:01:40 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:43921 (size: 12.3 KB, free: 13.5 GB)
19/11/16 00:01:40 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/16 00:01:40 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/16 00:01:40 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/16 00:01:40 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/16 00:01:40 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/16 00:01:40 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/16 00:01:40 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/16 00:01:41 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestxdnzCe/job_9cbc36d1-c7b9-4646-916c-c3dd82242eb7/MANIFEST
19/11/16 00:01:41 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestxdnzCe/job_9cbc36d1-c7b9-4646-916c-c3dd82242eb7/MANIFEST -> 0 artifacts
19/11/16 00:01:41 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/16 00:01:41 INFO main: Logging handler created.
19/11/16 00:01:41 INFO start: Status HTTP server running at localhost:33061
19/11/16 00:01:41 INFO main: semi_persistent_directory: /tmp
19/11/16 00:01:41 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/16 00:01:41 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573862496.64_f5872907-68c3-451a-a822-a577584d5ff2', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/16 00:01:41 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573862496.64', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:51711'}
19/11/16 00:01:41 INFO __init__: Creating state cache with size 0
19/11/16 00:01:41 INFO __init__: Creating insecure control channel for localhost:33399.
19/11/16 00:01:41 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/16 00:01:41 INFO __init__: Control channel established.
19/11/16 00:01:41 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/16 00:01:41 INFO create_state_handler: Creating insecure state channel for localhost:46681.
19/11/16 00:01:41 INFO create_state_handler: State channel established.
19/11/16 00:01:41 INFO create_data_channel: Creating client data channel for localhost:44161
19/11/16 00:01:41 INFO GrpcDataService: Beam Fn Data client connected.
19/11/16 00:01:41 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/16 00:01:41 INFO run: No more requests from control plane
19/11/16 00:01:41 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/16 00:01:41 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 00:01:41 INFO close: Closing all cached grpc data channels.
19/11/16 00:01:41 INFO close: Closing all cached gRPC state handlers.
19/11/16 00:01:41 INFO run: Done consuming work.
19/11/16 00:01:41 INFO main: Python sdk harness exiting.
19/11/16 00:01:41 INFO GrpcLoggingService: Logging client hanged up.
19/11/16 00:01:41 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/16 00:01:41 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 12013 bytes result sent to driver
19/11/16 00:01:41 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 834 ms on localhost (executor driver) (1/1)
19/11/16 00:01:41 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/16 00:01:41 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 0.840 s
19/11/16 00:01:41 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 4.186680 s
19/11/16 00:01:41 INFO SparkPipelineRunner: Job test_windowing_1573862496.64_f5872907-68c3-451a-a822-a577584d5ff2 finished.
19/11/16 00:01:41 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/16 00:01:41 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktestxdnzCe/job_9cbc36d1-c7b9-4646-916c-c3dd82242eb7/MANIFEST has 0 artifact locations
19/11/16 00:01:41 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestxdnzCe/job_9cbc36d1-c7b9-4646-916c-c3dd82242eb7/
INFO:root:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 229, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 419, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 73, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 497, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 429, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1573862487.97_97f9e918-a574-4ce4-8d9e-97c83d86ecdb failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140458341013248)>

# Thread: <Thread(Thread-119, started daemon 140458612184832)>

# Thread: <_MainThread(MainThread, started 140459127576320)>

----------------------------------------------------------------------
Ran 38 tests in 280.312s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 10s
59 actionable tasks: 47 executed, 12 from cache

Publishing build scan...
https://gradle.com/s/z75ksun2yneja

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1549

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1549/display/redirect?page=changes>

Changes:

[wenjialiu] [BEAM-8575] Test a customized window fn work as expected

[wenjialiu] fixup

[wenjialiu] fixup


------------------------------------------
[...truncated 1.65 MB...]
19/11/15 23:06:25 INFO DAGScheduler: waiting: Set(ShuffleMapStage 132, ResultStage 133)
19/11/15 23:06:25 INFO DAGScheduler: failed: Set()
19/11/15 23:06:25 INFO DAGScheduler: Submitting ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115), which has no missing parents
19/11/15 23:06:25 INFO MemoryStore: Block broadcast_129 stored as values in memory (estimated size 57.2 KB, free 13.5 GB)
19/11/15 23:06:25 INFO MemoryStore: Block broadcast_129_piece0 stored as bytes in memory (estimated size 22.9 KB, free 13.5 GB)
19/11/15 23:06:25 INFO BlockManagerInfo: Added broadcast_129_piece0 in memory on localhost:46317 (size: 22.9 KB, free: 13.5 GB)
19/11/15 23:06:25 INFO SparkContext: Created broadcast 129 from broadcast at DAGScheduler.scala:1161
19/11/15 23:06:25 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115) (first 15 tasks are for partitions Vector(0, 1))
19/11/15 23:06:25 INFO TaskSchedulerImpl: Adding task set 132.0 with 2 tasks
19/11/15 23:06:25 INFO TaskSetManager: Starting task 1.0 in stage 132.0 (TID 159, localhost, executor driver, partition 1, NODE_LOCAL, 7760 bytes)
19/11/15 23:06:25 INFO Executor: Running task 1.0 in stage 132.0 (TID 159)
19/11/15 23:06:25 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/15 23:06:25 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 1 ms
19/11/15 23:06:25 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestHgJKPH/job_bd7ff40e-ee08-4875-a611-66dcc6226f4c/MANIFEST
19/11/15 23:06:25 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestHgJKPH/job_bd7ff40e-ee08-4875-a611-66dcc6226f4c/MANIFEST -> 0 artifacts
19/11/15 23:06:26 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 23:06:26 INFO main: Logging handler created.
19/11/15 23:06:26 INFO start: Status HTTP server running at localhost:43653
19/11/15 23:06:26 INFO main: semi_persistent_directory: /tmp
19/11/15 23:06:26 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 23:06:26 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573859182.19_0c98abcd-9c40-44be-81bb-bc9a031df404', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 23:06:26 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573859182.19', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42149'}
19/11/15 23:06:26 INFO __init__: Creating state cache with size 0
19/11/15 23:06:26 INFO __init__: Creating insecure control channel for localhost:35001.
19/11/15 23:06:26 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/15 23:06:26 INFO __init__: Control channel established.
19/11/15 23:06:26 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/15 23:06:26 INFO create_state_handler: Creating insecure state channel for localhost:45761.
19/11/15 23:06:26 INFO create_state_handler: State channel established.
19/11/15 23:06:26 INFO create_data_channel: Creating client data channel for localhost:41525
19/11/15 23:06:26 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 23:06:26 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 23:06:26 INFO run: No more requests from control plane
19/11/15 23:06:26 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 23:06:26 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 23:06:26 INFO close: Closing all cached grpc data channels.
19/11/15 23:06:26 INFO close: Closing all cached gRPC state handlers.
19/11/15 23:06:26 INFO run: Done consuming work.
19/11/15 23:06:26 INFO main: Python sdk harness exiting.
19/11/15 23:06:26 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 23:06:26 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 23:06:26 INFO Executor: Finished task 1.0 in stage 132.0 (TID 159). 15229 bytes result sent to driver
19/11/15 23:06:26 INFO TaskSetManager: Starting task 0.0 in stage 132.0 (TID 160, localhost, executor driver, partition 0, PROCESS_LOCAL, 7977 bytes)
19/11/15 23:06:26 INFO Executor: Running task 0.0 in stage 132.0 (TID 160)
19/11/15 23:06:26 INFO TaskSetManager: Finished task 1.0 in stage 132.0 (TID 159) in 987 ms on localhost (executor driver) (1/2)
19/11/15 23:06:26 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestHgJKPH/job_bd7ff40e-ee08-4875-a611-66dcc6226f4c/MANIFEST
19/11/15 23:06:26 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestHgJKPH/job_bd7ff40e-ee08-4875-a611-66dcc6226f4c/MANIFEST -> 0 artifacts
19/11/15 23:06:27 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 23:06:27 INFO main: Logging handler created.
19/11/15 23:06:27 INFO start: Status HTTP server running at localhost:35917
19/11/15 23:06:27 INFO main: semi_persistent_directory: /tmp
19/11/15 23:06:27 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 23:06:27 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573859182.19_0c98abcd-9c40-44be-81bb-bc9a031df404', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 23:06:27 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573859182.19', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42149'}
19/11/15 23:06:27 INFO __init__: Creating state cache with size 0
19/11/15 23:06:27 INFO __init__: Creating insecure control channel for localhost:35673.
19/11/15 23:06:27 INFO __init__: Control channel established.
19/11/15 23:06:27 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/15 23:06:27 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/15 23:06:27 INFO create_state_handler: Creating insecure state channel for localhost:45299.
19/11/15 23:06:27 INFO create_state_handler: State channel established.
19/11/15 23:06:27 INFO create_data_channel: Creating client data channel for localhost:37407
19/11/15 23:06:27 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 23:06:27 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 23:06:27 INFO run: No more requests from control plane
19/11/15 23:06:27 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 23:06:27 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 23:06:27 INFO close: Closing all cached grpc data channels.
19/11/15 23:06:27 INFO close: Closing all cached gRPC state handlers.
19/11/15 23:06:27 INFO run: Done consuming work.
19/11/15 23:06:27 INFO main: Python sdk harness exiting.
19/11/15 23:06:27 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 23:06:27 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 23:06:27 INFO Executor: Finished task 0.0 in stage 132.0 (TID 160). 13710 bytes result sent to driver
19/11/15 23:06:27 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 160) in 929 ms on localhost (executor driver) (2/2)
19/11/15 23:06:27 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/15 23:06:27 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 1.922 s
19/11/15 23:06:27 INFO DAGScheduler: looking for newly runnable stages
19/11/15 23:06:27 INFO DAGScheduler: running: Set()
19/11/15 23:06:27 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/15 23:06:27 INFO DAGScheduler: failed: Set()
19/11/15 23:06:27 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/15 23:06:27 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.5 GB)
19/11/15 23:06:27 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.4 KB, free 13.5 GB)
19/11/15 23:06:27 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:46317 (size: 12.4 KB, free: 13.5 GB)
19/11/15 23:06:27 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/15 23:06:27 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/15 23:06:27 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/15 23:06:27 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/15 23:06:27 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/15 23:06:27 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/15 23:06:27 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/15 23:06:27 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestHgJKPH/job_bd7ff40e-ee08-4875-a611-66dcc6226f4c/MANIFEST
19/11/15 23:06:27 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestHgJKPH/job_bd7ff40e-ee08-4875-a611-66dcc6226f4c/MANIFEST -> 0 artifacts
19/11/15 23:06:27 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 23:06:27 INFO main: Logging handler created.
19/11/15 23:06:27 INFO start: Status HTTP server running at localhost:40777
19/11/15 23:06:27 INFO main: semi_persistent_directory: /tmp
19/11/15 23:06:27 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 23:06:27 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573859182.19_0c98abcd-9c40-44be-81bb-bc9a031df404', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 23:06:27 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573859182.19', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42149'}
19/11/15 23:06:27 INFO __init__: Creating state cache with size 0
19/11/15 23:06:27 INFO __init__: Creating insecure control channel for localhost:43885.
19/11/15 23:06:27 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/15 23:06:27 INFO __init__: Control channel established.
19/11/15 23:06:27 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/15 23:06:27 INFO create_state_handler: Creating insecure state channel for localhost:44087.
19/11/15 23:06:27 INFO create_state_handler: State channel established.
19/11/15 23:06:28 INFO create_data_channel: Creating client data channel for localhost:43639
19/11/15 23:06:28 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 23:06:28 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 23:06:28 INFO run: No more requests from control plane
19/11/15 23:06:28 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 23:06:28 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 23:06:28 INFO close: Closing all cached grpc data channels.
19/11/15 23:06:28 INFO close: Closing all cached gRPC state handlers.
19/11/15 23:06:28 INFO run: Done consuming work.
19/11/15 23:06:28 INFO main: Python sdk harness exiting.
19/11/15 23:06:28 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 23:06:28 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 23:06:28 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 11970 bytes result sent to driver
19/11/15 23:06:28 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 948 ms on localhost (executor driver) (1/1)
19/11/15 23:06:28 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/15 23:06:28 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 0.954 s
19/11/15 23:06:28 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 4.861748 s
19/11/15 23:06:28 INFO SparkPipelineRunner: Job test_windowing_1573859182.19_0c98abcd-9c40-44be-81bb-bc9a031df404 finished.
19/11/15 23:06:28 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/15 23:06:28 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktestHgJKPH/job_bd7ff40e-ee08-4875-a611-66dcc6226f4c/MANIFEST has 0 artifact locations
19/11/15 23:06:28 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestHgJKPH/job_bd7ff40e-ee08-4875-a611-66dcc6226f4c/
INFO:root:Job state changed to DONE
.
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140175854270208)>

# Thread: <Thread(Thread-119, started daemon 140175837484800)>

# Thread: <_MainThread(MainThread, started 140176977422080)>
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140175820699392)>

# Thread: <Thread(Thread-125, started daemon 140175829092096)>

# Thread: <_MainThread(MainThread, started 140176977422080)>

# Thread: <Thread(Thread-119, started daemon 140175837484800)>

# Thread: <Thread(wait_until_finish_read, started daemon 140175854270208)>
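
The wait_until_finish frames in the errors below show where the client is stuck: iterating a server-streaming gRPC call to the job service for state updates. A rough sketch of that loop, assuming the Beam portability job API protos (the stub and message names are taken from the Beam job API, but treat all details as illustrative, not the runner's exact implementation):

    import grpc
    from apache_beam.portability.api import beam_job_api_pb2
    from apache_beam.portability.api import beam_job_api_pb2_grpc

    def wait_until_finish(job_endpoint, job_id):
        # Server-streaming RPC: between messages the client blocks inside
        # grpc._common.wait, which is exactly where the watchdog keeps
        # catching these tests when the stream never delivers a state.
        stub = beam_job_api_pb2_grpc.JobServiceStub(
            grpc.insecure_channel(job_endpoint))
        request = beam_job_api_pb2.GetJobStateRequest(job_id=job_id)
        for state_response in stub.GetStateStream(request):
            if state_response.state in (beam_job_api_pb2.JobState.DONE,
                                        beam_job_api_pb2.JobState.FAILED,
                                        beam_job_api_pb2.JobState.CANCELLED):
                return state_response.state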

======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 229, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 419, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 73, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
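
test_pardo_state_with_custom_key_coder exercises the portability state API, where every state read and write is addressed by an encoded key, so the runner and the SDK harness must agree on the key coder. A minimal sketch of the kind of stateful ParDo involved, using the public userstate API (illustrative; the real test additionally installs a custom key coder, which is the behaviour it verifies):

    import apache_beam as beam
    from apache_beam.coders import VarIntCoder
    from apache_beam.transforms.userstate import BagStateSpec

    class BufferingDoFn(beam.DoFn):
        # Per-key bag state; requests for it are routed over the gRPC
        # state channel seen in the worker logs above.
        BUFFER = BagStateSpec('buffer', VarIntCoder())

        def process(self, kv, buffer=beam.DoFn.StateParam(BUFFER)):
            key, value = kv
            buffer.add(value)
            yield key, list(buffer.read())

    with beam.Pipeline() as p:
        _ = (p
             | beam.Create([('a', 1), ('a', 2), ('b', 3)])
             | beam.ParDo(BufferingDoFn()))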

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 326, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 419, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 73, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
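
test_pardo_timers never reaches its timer callbacks here; the run stalls in the same state stream as above. For reference, a minimal watermark-timer DoFn using the public userstate API (illustrative, not the test's exact body):

    import apache_beam as beam
    from apache_beam.coders import VarIntCoder
    from apache_beam.transforms.userstate import BagStateSpec, TimerSpec, on_timer
    from apache_beam.transforms.timeutil import TimeDomain

    class TimerDoFn(beam.DoFn):
        BUFFER = BagStateSpec('buffer', VarIntCoder())
        FLUSH = TimerSpec('flush', TimeDomain.WATERMARK)

        def process(self, kv,
                    buffer=beam.DoFn.StateParam(BUFFER),
                    flush=beam.DoFn.TimerParam(FLUSH)):
            _, value = kv
            buffer.add(value)
            flush.set(20)  # fire once the watermark passes timestamp 20

        @on_timer(FLUSH)
        def flush_buffer(self, buffer=beam.DoFn.StateParam(BUFFER)):
            for value in buffer.read():
                yield value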

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 497, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 429, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1573859171.9_63cfb267-d53d-47a3-947c-c699980faab9 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
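
Unlike the two timeouts above, this error is deterministic: a splittable DoFn that tracks watermarks hands the runner a checkpoint (the unclaimed residual of its restriction), and at this point the Spark portable runner's ActiveBundle has no checkpoint handler registered, hence the java.lang.UnsupportedOperationException surfaced from the Java side. For reference, a minimal splittable DoFn sketch using the public SDF API (illustrative; the real test also attaches watermark estimation, which is what forces checkpointing):

    import apache_beam as beam
    from apache_beam.io.restriction_trackers import OffsetRange, OffsetRestrictionTracker
    from apache_beam.transforms.core import RestrictionProvider

    class CharRestrictionProvider(RestrictionProvider):
        def initial_restriction(self, element):
            return OffsetRange(0, len(element))

        def create_tracker(self, restriction):
            return OffsetRestrictionTracker(restriction)

        def restriction_size(self, element, restriction):
            return restriction.size()

    class EmitChars(beam.DoFn):
        # One output character per claimed offset; the tracker is the hook
        # through which the runner splits work and receives residuals.
        def process(self, element,
                    tracker=beam.DoFn.RestrictionParam(CharRestrictionProvider())):
            position = tracker.current_restriction().start
            while tracker.try_claim(position):
                yield element[position]
                position += 1

This also explains the assertion above: assert_that(actual, equal_to(list(''.join(data)))) expects the input strings flattened into individual characters.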

----------------------------------------------------------------------
Ran 38 tests in 312.756s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 46s
59 actionable tasks: 46 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/hijo7xnpl3hbk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1548

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1548/display/redirect?page=changes>

Changes:

[migryz] Reduce Java Examples Dataflow Precommit timeout


------------------------------------------
[...truncated 1.65 MB...]
19/11/15 22:13:51 INFO DAGScheduler: waiting: Set(ShuffleMapStage 132, ResultStage 133)
19/11/15 22:13:51 INFO DAGScheduler: failed: Set()
19/11/15 22:13:51 INFO DAGScheduler: Submitting ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115), which has no missing parents
19/11/15 22:13:51 INFO MemoryStore: Block broadcast_129 stored as values in memory (estimated size 57.2 KB, free 13.5 GB)
19/11/15 22:13:51 INFO MemoryStore: Block broadcast_129_piece0 stored as bytes in memory (estimated size 22.0 KB, free 13.5 GB)
19/11/15 22:13:51 INFO BlockManagerInfo: Added broadcast_129_piece0 in memory on localhost:37437 (size: 22.0 KB, free: 13.5 GB)
19/11/15 22:13:51 INFO SparkContext: Created broadcast 129 from broadcast at DAGScheduler.scala:1161
19/11/15 22:13:51 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115) (first 15 tasks are for partitions Vector(0, 1))
19/11/15 22:13:51 INFO TaskSchedulerImpl: Adding task set 132.0 with 2 tasks
19/11/15 22:13:51 INFO TaskSetManager: Starting task 0.0 in stage 132.0 (TID 159, localhost, executor driver, partition 0, NODE_LOCAL, 7760 bytes)
19/11/15 22:13:51 INFO Executor: Running task 0.0 in stage 132.0 (TID 159)
19/11/15 22:13:51 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/15 22:13:51 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 1 ms
19/11/15 22:13:51 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestQQ4x5v/job_f781ee39-b283-42d4-bd12-b6281b23aa2a/MANIFEST
19/11/15 22:13:51 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestQQ4x5v/job_f781ee39-b283-42d4-bd12-b6281b23aa2a/MANIFEST -> 0 artifacts
19/11/15 22:13:51 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 22:13:51 INFO main: Logging handler created.
19/11/15 22:13:51 INFO start: Status HTTP server running at localhost:39949
19/11/15 22:13:51 INFO main: semi_persistent_directory: /tmp
19/11/15 22:13:51 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 22:13:51 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573856028.24_abe1b614-fa97-4a95-9f22-6ff973c586aa', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 22:13:51 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573856028.24', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:47593'}
19/11/15 22:13:51 INFO __init__: Creating state cache with size 0
19/11/15 22:13:51 INFO __init__: Creating insecure control channel for localhost:36211.
19/11/15 22:13:51 INFO __init__: Control channel established.
19/11/15 22:13:51 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/15 22:13:51 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/15 22:13:51 INFO create_state_handler: Creating insecure state channel for localhost:42015.
19/11/15 22:13:51 INFO create_state_handler: State channel established.
19/11/15 22:13:51 INFO create_data_channel: Creating client data channel for localhost:39437
19/11/15 22:13:51 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 22:13:51 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 22:13:51 INFO run: No more requests from control plane
19/11/15 22:13:51 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 22:13:51 INFO close: Closing all cached grpc data channels.
19/11/15 22:13:51 INFO close: Closing all cached gRPC state handlers.
19/11/15 22:13:51 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 22:13:51 INFO run: Done consuming work.
19/11/15 22:13:51 INFO main: Python sdk harness exiting.
19/11/15 22:13:51 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 22:13:52 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 22:13:52 INFO Executor: Finished task 0.0 in stage 132.0 (TID 159). 15229 bytes result sent to driver
19/11/15 22:13:52 INFO TaskSetManager: Starting task 1.0 in stage 132.0 (TID 160, localhost, executor driver, partition 1, PROCESS_LOCAL, 7977 bytes)
19/11/15 22:13:52 INFO Executor: Running task 1.0 in stage 132.0 (TID 160)
19/11/15 22:13:52 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 159) in 973 ms on localhost (executor driver) (1/2)
19/11/15 22:13:52 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestQQ4x5v/job_f781ee39-b283-42d4-bd12-b6281b23aa2a/MANIFEST
19/11/15 22:13:52 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestQQ4x5v/job_f781ee39-b283-42d4-bd12-b6281b23aa2a/MANIFEST -> 0 artifacts
19/11/15 22:13:52 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 22:13:52 INFO main: Logging handler created.
19/11/15 22:13:52 INFO start: Status HTTP server running at localhost:36557
19/11/15 22:13:52 INFO main: semi_persistent_directory: /tmp
19/11/15 22:13:52 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 22:13:52 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573856028.24_abe1b614-fa97-4a95-9f22-6ff973c586aa', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 22:13:52 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573856028.24', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:47593'}
19/11/15 22:13:52 INFO __init__: Creating state cache with size 0
19/11/15 22:13:52 INFO __init__: Creating insecure control channel for localhost:43937.
19/11/15 22:13:52 INFO __init__: Control channel established.
19/11/15 22:13:52 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/15 22:13:52 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/15 22:13:52 INFO create_state_handler: Creating insecure state channel for localhost:36761.
19/11/15 22:13:52 INFO create_state_handler: State channel established.
19/11/15 22:13:52 INFO create_data_channel: Creating client data channel for localhost:45285
19/11/15 22:13:52 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 22:13:52 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 22:13:52 INFO run: No more requests from control plane
19/11/15 22:13:52 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 22:13:52 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 22:13:52 INFO close: Closing all cached grpc data channels.
19/11/15 22:13:52 INFO close: Closing all cached gRPC state handlers.
19/11/15 22:13:52 INFO run: Done consuming work.
19/11/15 22:13:52 INFO main: Python sdk harness exiting.
19/11/15 22:13:52 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 22:13:52 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 22:13:52 INFO Executor: Finished task 1.0 in stage 132.0 (TID 160). 13710 bytes result sent to driver
19/11/15 22:13:52 INFO TaskSetManager: Finished task 1.0 in stage 132.0 (TID 160) in 889 ms on localhost (executor driver) (2/2)
19/11/15 22:13:52 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/15 22:13:52 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 1.868 s
19/11/15 22:13:52 INFO DAGScheduler: looking for newly runnable stages
19/11/15 22:13:52 INFO DAGScheduler: running: Set()
19/11/15 22:13:52 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/15 22:13:52 INFO DAGScheduler: failed: Set()
19/11/15 22:13:52 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/15 22:13:52 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.5 GB)
19/11/15 22:13:52 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.3 KB, free 13.5 GB)
19/11/15 22:13:52 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:37437 (size: 12.3 KB, free: 13.5 GB)
19/11/15 22:13:52 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/15 22:13:52 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/15 22:13:52 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/15 22:13:52 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/15 22:13:52 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/15 22:13:52 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/15 22:13:52 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/15 22:13:52 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestQQ4x5v/job_f781ee39-b283-42d4-bd12-b6281b23aa2a/MANIFEST
19/11/15 22:13:52 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestQQ4x5v/job_f781ee39-b283-42d4-bd12-b6281b23aa2a/MANIFEST -> 0 artifacts
19/11/15 22:13:53 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 22:13:53 INFO main: Logging handler created.
19/11/15 22:13:53 INFO start: Status HTTP server running at localhost:46251
19/11/15 22:13:53 INFO main: semi_persistent_directory: /tmp
19/11/15 22:13:53 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 22:13:53 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573856028.24_abe1b614-fa97-4a95-9f22-6ff973c586aa', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 22:13:53 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573856028.24', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:47593'}
19/11/15 22:13:53 INFO __init__: Creating state cache with size 0
19/11/15 22:13:53 INFO __init__: Creating insecure control channel for localhost:35855.
19/11/15 22:13:53 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/15 22:13:53 INFO __init__: Control channel established.
19/11/15 22:13:53 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/15 22:13:53 INFO create_state_handler: Creating insecure state channel for localhost:33651.
19/11/15 22:13:53 INFO create_state_handler: State channel established.
19/11/15 22:13:53 INFO create_data_channel: Creating client data channel for localhost:33787
19/11/15 22:13:53 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 22:13:53 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 22:13:53 INFO run: No more requests from control plane
19/11/15 22:13:53 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 22:13:53 INFO close: Closing all cached grpc data channels.
19/11/15 22:13:53 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 22:13:53 INFO close: Closing all cached gRPC state handlers.
19/11/15 22:13:53 INFO run: Done consuming work.
19/11/15 22:13:53 INFO main: Python sdk harness exiting.
19/11/15 22:13:53 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 22:13:53 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 22:13:53 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 11970 bytes result sent to driver
19/11/15 22:13:53 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 933 ms on localhost (executor driver) (1/1)
19/11/15 22:13:53 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/15 22:13:53 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 0.941 s
19/11/15 22:13:53 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 4.622374 s
19/11/15 22:13:53 INFO SparkPipelineRunner: Job test_windowing_1573856028.24_abe1b614-fa97-4a95-9f22-6ff973c586aa finished.
19/11/15 22:13:53 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/15 22:13:53 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktestQQ4x5v/job_f781ee39-b283-42d4-bd12-b6281b23aa2a/MANIFEST has 0 artifact locations
19/11/15 22:13:53 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestQQ4x5v/job_f781ee39-b283-42d4-bd12-b6281b23aa2a/
INFO:root:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 229, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 419, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 73, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140147181561600)>

# Thread: <Thread(Thread-115, started daemon 140147173168896)>

# Thread: <_MainThread(MainThread, started 140147960792832)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140147155597056)>

# Thread: <Thread(Thread-121, started daemon 140147147204352)>

# Thread: <Thread(Thread-115, started daemon 140147173168896)>

# Thread: <Thread(wait_until_finish_read, started daemon 140147181561600)>

# Thread: <_MainThread(MainThread, started 140147960792832)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 326, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 419, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 73, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 497, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 429, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1573856018.82_28d57d68-dcaa-4f17-8d4e-623a32d71d0e failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 315.141s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 40s
59 actionable tasks: 46 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/dq7khcqn4xcj6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1547

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1547/display/redirect?page=changes>

Changes:

[ningk] [BEAM-8379] Cache Eviction


------------------------------------------
[...truncated 1.68 MB...]
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2551
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2831
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2545
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2932
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2558
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2828
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2886
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2534
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2717
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2681
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2525
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2849
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2754
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2564
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2644
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2611
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2933
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2568
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2846
19/11/15 21:27:25 INFO ContextCleaner: Cleaned shuffle 64
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2922
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2677
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2796
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2730
19/11/15 21:27:25 INFO ContextCleaner: Cleaned shuffle 65
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2688
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2734
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2770
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2656
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2626
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2916
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2539
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2587
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2894
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2543
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2764
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2647
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2755
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2792
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2909
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2737
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2850
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2928
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2576
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2777
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2657
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2690
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2768
19/11/15 21:27:25 INFO ContextCleaner: Cleaned accumulator 2766
19/11/15 21:27:26 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 21:27:26 INFO main: Logging handler created.
19/11/15 21:27:26 INFO start: Status HTTP server running at localhost:33041
19/11/15 21:27:26 INFO main: semi_persistent_directory: /tmp
19/11/15 21:27:26 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 21:27:26 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573853237.56_ee1d2b4b-1dfa-4200-9249-8f8c35c93389', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 21:27:26 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573853237.56', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36517'}
19/11/15 21:27:26 INFO __init__: Creating state cache with size 0
19/11/15 21:27:26 INFO __init__: Creating insecure control channel for localhost:46705.
19/11/15 21:27:26 INFO __init__: Control channel established.
19/11/15 21:27:26 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/15 21:27:26 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/15 21:27:26 INFO create_state_handler: Creating insecure state channel for localhost:33869.
19/11/15 21:27:26 INFO create_state_handler: State channel established.
19/11/15 21:27:26 INFO create_data_channel: Creating client data channel for localhost:42289
19/11/15 21:27:26 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 21:27:26 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 21:27:26 INFO run: No more requests from control plane
19/11/15 21:27:26 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 21:27:26 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 21:27:26 INFO close: Closing all cached grpc data channels.
19/11/15 21:27:26 INFO close: Closing all cached gRPC state handlers.
19/11/15 21:27:26 INFO run: Done consuming work.
19/11/15 21:27:26 INFO main: Python sdk harness exiting.
19/11/15 21:27:26 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 21:27:26 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 21:27:27 INFO Executor: Finished task 0.0 in stage 132.0 (TID 160). 13753 bytes result sent to driver
19/11/15 21:27:27 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 160) in 1641 ms on localhost (executor driver) (2/2)
19/11/15 21:27:27 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/15 21:27:27 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 4.894 s
19/11/15 21:27:27 INFO DAGScheduler: looking for newly runnable stages
19/11/15 21:27:27 INFO DAGScheduler: running: Set()
19/11/15 21:27:27 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/15 21:27:27 INFO DAGScheduler: failed: Set()
19/11/15 21:27:27 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/15 21:27:27 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.5 GB)
19/11/15 21:27:27 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.4 KB, free 13.5 GB)
19/11/15 21:27:27 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:38775 (size: 12.4 KB, free: 13.5 GB)
19/11/15 21:27:27 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/15 21:27:27 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/15 21:27:27 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/15 21:27:27 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/15 21:27:27 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/15 21:27:27 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/15 21:27:27 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/15 21:27:27 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest1fpj1E/job_38ace3b9-1b83-4d9b-a029-43f964b076a9/MANIFEST
19/11/15 21:27:27 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest1fpj1E/job_38ace3b9-1b83-4d9b-a029-43f964b076a9/MANIFEST -> 0 artifacts
19/11/15 21:27:28 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 21:27:28 INFO main: Logging handler created.
19/11/15 21:27:28 INFO start: Status HTTP server running at localhost:43305
19/11/15 21:27:28 INFO main: semi_persistent_directory: /tmp
19/11/15 21:27:28 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 21:27:28 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573853237.56_ee1d2b4b-1dfa-4200-9249-8f8c35c93389', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 21:27:28 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573853237.56', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36517'}
19/11/15 21:27:28 INFO __init__: Creating state cache with size 0
19/11/15 21:27:28 INFO __init__: Creating insecure control channel for localhost:39763.
19/11/15 21:27:28 INFO __init__: Control channel established.
19/11/15 21:27:28 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/15 21:27:28 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/15 21:27:28 INFO create_state_handler: Creating insecure state channel for localhost:33207.
19/11/15 21:27:28 INFO create_state_handler: State channel established.
19/11/15 21:27:28 INFO create_data_channel: Creating client data channel for localhost:36429
19/11/15 21:27:28 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 21:27:28 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 21:27:28 INFO run: No more requests from control plane
19/11/15 21:27:28 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 21:27:28 INFO close: Closing all cached grpc data channels.
19/11/15 21:27:28 INFO close: Closing all cached gRPC state handlers.
19/11/15 21:27:28 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 21:27:28 INFO run: Done consuming work.
19/11/15 21:27:28 INFO main: Python sdk harness exiting.
19/11/15 21:27:28 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 21:27:28 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 21:27:28 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 11970 bytes result sent to driver
19/11/15 21:27:28 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 909 ms on localhost (executor driver) (1/1)
19/11/15 21:27:28 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/15 21:27:28 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 0.917 s
19/11/15 21:27:28 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 9.199294 s
19/11/15 21:27:28 INFO SparkPipelineRunner: Job test_windowing_1573853237.56_ee1d2b4b-1dfa-4200-9249-8f8c35c93389 finished.
19/11/15 21:27:28 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/15 21:27:28 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktest1fpj1E/job_38ace3b9-1b83-4d9b-a029-43f964b076a9/MANIFEST has 0 artifact locations
19/11/15 21:27:29 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktest1fpj1E/job_38ace3b9-1b83-4d9b-a029-43f964b076a9/
INFO:root:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 229, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 419, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 73, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
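
The timeout above comes from a per-test watchdog: the handler frame in
portable_runner_test.py fires after 60 seconds, evidently prints the
"==================== Timed out ... ====================" banner plus a
"# Thread:" listing of live threads (which is why those lines interleave
with the tracebacks below), and raises BaseException so the test cannot
swallow it. A minimal sketch of that pattern, assuming a SIGALRM-based
alarm; names and details are illustrative, not the exact Beam test code:

import signal
import threading

TIMEOUT_SECS = 60  # matches the "Timed out after 60 seconds." message

def handler(signum, frame):
    # Illustrative watchdog: dump the live threads so hangs like the gRPC
    # wait above are diagnosable, then raise BaseException (not Exception)
    # so ordinary except clauses in the test cannot catch it.
    msg = 'Timed out after %d seconds.' % TIMEOUT_SECS
    print('==================== %s ====================' % msg)
    for t in threading.enumerate():
        print('# Thread: %r' % t)
    raise BaseException(msg)

signal.signal(signal.SIGALRM, handler)
signal.alarm(TIMEOUT_SECS)  # armed before each test body runs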

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 326, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 419, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 73, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140173112760064)>
# Thread: <Thread(Thread-119, started daemon 140173121152768)>
# Thread: <_MainThread(MainThread, started 140173908776704)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140172624652032)>
# Thread: <Thread(Thread-125, started daemon 140172633044736)>
# Thread: <Thread(Thread-119, started daemon 140173121152768)>
# Thread: <_MainThread(MainThread, started 140173908776704)>
# Thread: <Thread(wait_until_finish_read, started daemon 140173112760064)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 497, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 429, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1573853221.01_bc98c626-8347-4ac0-921e-ef12a27830e3 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
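
The UnsupportedOperationException above looks like a runner-side gap rather
than a test bug: test_sdf_with_watermark_tracking runs a splittable DoFn
(SDF), and when the SDK harness hands back an unfinished remainder of its
restriction, the Spark portable runner's ActiveBundle has no checkpoint
handler registered to accept it. A generic SDF sketch, assuming the standard
Python RestrictionProvider API; illustrative, not the test's actual code:

import apache_beam as beam
from apache_beam.io.restriction_trackers import OffsetRange, OffsetRestrictionTracker

class PerCharProvider(beam.transforms.core.RestrictionProvider):
    # Hypothetical provider: one offset per character of the element.
    def initial_restriction(self, element):
        return OffsetRange(0, len(element))

    def create_tracker(self, restriction):
        return OffsetRestrictionTracker(restriction)

    def restriction_size(self, element, restriction):
        return restriction.size()

class ExpandCharsDoFn(beam.DoFn):
    def process(self, element, tracker=beam.DoFn.RestrictionParam(PerCharProvider())):
        restriction = tracker.current_restriction()
        for pos in range(restriction.start, restriction.stop):
            if not tracker.try_claim(pos):
                # Any unclaimed remainder becomes a checkpoint/residual for
                # the runner to resume, the step the Spark runner rejects.
                return
            yield element[pos]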

----------------------------------------------------------------------
Ran 38 tests in 334.871s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 6s
59 actionable tasks: 46 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/zbw3g6zznxad2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1546

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1546/display/redirect?page=changes>

Changes:

[kirillkozlov] Filter push-down for BigQuery (kind of) working.

[kirillkozlov] Added IT test for BigQuery. spotlesApply.

[kirillkozlov] review comments


------------------------------------------
[...truncated 1.66 MB...]
19/11/15 20:56:59 INFO main: Logging handler created.
19/11/15 20:56:59 INFO start: Status HTTP server running at localhost:42195
19/11/15 20:56:59 INFO main: semi_persistent_directory: /tmp
19/11/15 20:56:59 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 20:56:59 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573851416.44_51f9e587-e649-4faa-b544-e5af8edf9bd7', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 20:56:59 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573851416.44', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35777'}
19/11/15 20:56:59 INFO __init__: Creating state cache with size 0
19/11/15 20:56:59 INFO __init__: Creating insecure control channel for localhost:34729.
19/11/15 20:56:59 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/15 20:56:59 INFO __init__: Control channel established.
19/11/15 20:56:59 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/15 20:56:59 INFO create_state_handler: Creating insecure state channel for localhost:44575.
19/11/15 20:56:59 INFO create_state_handler: State channel established.
19/11/15 20:56:59 INFO create_data_channel: Creating client data channel for localhost:46559
19/11/15 20:56:59 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 20:56:59 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/15 20:56:59 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 1 ms
19/11/15 20:56:59 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 20:56:59 INFO run: No more requests from control plane
19/11/15 20:56:59 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 20:56:59 INFO close: Closing all cached grpc data channels.
19/11/15 20:56:59 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 20:56:59 INFO close: Closing all cached gRPC state handlers.
19/11/15 20:56:59 INFO run: Done consuming work.
19/11/15 20:56:59 INFO main: Python sdk harness exiting.
19/11/15 20:56:59 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 20:56:59 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 20:56:59 INFO Executor: Finished task 0.0 in stage 131.0 (TID 158). 12763 bytes result sent to driver
19/11/15 20:56:59 INFO TaskSetManager: Finished task 0.0 in stage 131.0 (TID 158) in 1110 ms on localhost (executor driver) (1/1)
19/11/15 20:56:59 INFO TaskSchedulerImpl: Removed TaskSet 131.0, whose tasks have all completed, from pool 
19/11/15 20:56:59 INFO DAGScheduler: ShuffleMapStage 131 (mapToPair at GroupCombineFunctions.java:55) finished in 1.117 s
19/11/15 20:56:59 INFO DAGScheduler: looking for newly runnable stages
19/11/15 20:56:59 INFO DAGScheduler: running: Set()
19/11/15 20:56:59 INFO DAGScheduler: waiting: Set(ShuffleMapStage 132, ResultStage 133)
19/11/15 20:56:59 INFO DAGScheduler: failed: Set()
19/11/15 20:56:59 INFO DAGScheduler: Submitting ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115), which has no missing parents
19/11/15 20:56:59 INFO MemoryStore: Block broadcast_129 stored as values in memory (estimated size 57.2 KB, free 13.5 GB)
19/11/15 20:56:59 INFO MemoryStore: Block broadcast_129_piece0 stored as bytes in memory (estimated size 22.9 KB, free 13.5 GB)
19/11/15 20:56:59 INFO BlockManagerInfo: Added broadcast_129_piece0 in memory on localhost:36559 (size: 22.9 KB, free: 13.5 GB)
19/11/15 20:56:59 INFO SparkContext: Created broadcast 129 from broadcast at DAGScheduler.scala:1161
19/11/15 20:56:59 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115) (first 15 tasks are for partitions Vector(0, 1))
19/11/15 20:56:59 INFO TaskSchedulerImpl: Adding task set 132.0 with 2 tasks
19/11/15 20:56:59 INFO TaskSetManager: Starting task 1.0 in stage 132.0 (TID 159, localhost, executor driver, partition 1, NODE_LOCAL, 7760 bytes)
19/11/15 20:56:59 INFO Executor: Running task 1.0 in stage 132.0 (TID 159)
19/11/15 20:56:59 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/15 20:56:59 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/15 20:56:59 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestvSVzhp/job_e54b0a12-2556-437a-8d54-2e8093e7ed57/MANIFEST
19/11/15 20:56:59 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestvSVzhp/job_e54b0a12-2556-437a-8d54-2e8093e7ed57/MANIFEST -> 0 artifacts
19/11/15 20:57:00 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 20:57:00 INFO main: Logging handler created.
19/11/15 20:57:00 INFO start: Status HTTP server running at localhost:46693
19/11/15 20:57:00 INFO main: semi_persistent_directory: /tmp
19/11/15 20:57:00 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 20:57:00 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573851416.44_51f9e587-e649-4faa-b544-e5af8edf9bd7', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 20:57:00 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573851416.44', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35777'}
19/11/15 20:57:00 INFO __init__: Creating state cache with size 0
19/11/15 20:57:00 INFO __init__: Creating insecure control channel for localhost:38925.
19/11/15 20:57:00 INFO __init__: Control channel established.
19/11/15 20:57:00 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/15 20:57:00 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/15 20:57:00 INFO create_state_handler: Creating insecure state channel for localhost:42063.
19/11/15 20:57:00 INFO create_state_handler: State channel established.
19/11/15 20:57:00 INFO create_data_channel: Creating client data channel for localhost:43869
19/11/15 20:57:00 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 20:57:00 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 20:57:00 INFO run: No more requests from control plane
19/11/15 20:57:00 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 20:57:00 INFO close: Closing all cached grpc data channels.
19/11/15 20:57:00 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 20:57:00 INFO close: Closing all cached gRPC state handlers.
19/11/15 20:57:00 INFO run: Done consuming work.
19/11/15 20:57:00 INFO main: Python sdk harness exiting.
19/11/15 20:57:00 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 20:57:00 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 20:57:00 INFO Executor: Finished task 1.0 in stage 132.0 (TID 159). 15229 bytes result sent to driver
19/11/15 20:57:00 INFO TaskSetManager: Starting task 0.0 in stage 132.0 (TID 160, localhost, executor driver, partition 0, PROCESS_LOCAL, 7977 bytes)
19/11/15 20:57:00 INFO Executor: Running task 0.0 in stage 132.0 (TID 160)
19/11/15 20:57:00 INFO TaskSetManager: Finished task 1.0 in stage 132.0 (TID 159) in 1084 ms on localhost (executor driver) (1/2)
19/11/15 20:57:00 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestvSVzhp/job_e54b0a12-2556-437a-8d54-2e8093e7ed57/MANIFEST
19/11/15 20:57:00 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestvSVzhp/job_e54b0a12-2556-437a-8d54-2e8093e7ed57/MANIFEST -> 0 artifacts
19/11/15 20:57:01 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 20:57:01 INFO main: Logging handler created.
19/11/15 20:57:01 INFO start: Status HTTP server running at localhost:34635
19/11/15 20:57:01 INFO main: semi_persistent_directory: /tmp
19/11/15 20:57:01 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 20:57:01 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573851416.44_51f9e587-e649-4faa-b544-e5af8edf9bd7', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 20:57:01 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573851416.44', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35777'}
19/11/15 20:57:01 INFO __init__: Creating state cache with size 0
19/11/15 20:57:01 INFO __init__: Creating insecure control channel for localhost:41111.
19/11/15 20:57:01 INFO __init__: Control channel established.
19/11/15 20:57:01 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/15 20:57:01 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/15 20:57:01 INFO create_state_handler: Creating insecure state channel for localhost:41617.
19/11/15 20:57:01 INFO create_state_handler: State channel established.
19/11/15 20:57:01 INFO create_data_channel: Creating client data channel for localhost:39985
19/11/15 20:57:01 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 20:57:01 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 20:57:01 INFO run: No more requests from control plane
19/11/15 20:57:01 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 20:57:01 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 20:57:01 INFO close: Closing all cached grpc data channels.
19/11/15 20:57:01 INFO close: Closing all cached gRPC state handlers.
19/11/15 20:57:01 INFO run: Done consuming work.
19/11/15 20:57:01 INFO main: Python sdk harness exiting.
19/11/15 20:57:01 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 20:57:01 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 20:57:01 INFO Executor: Finished task 0.0 in stage 132.0 (TID 160). 13710 bytes result sent to driver
19/11/15 20:57:01 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 160) in 994 ms on localhost (executor driver) (2/2)
19/11/15 20:57:01 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/15 20:57:01 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 2.084 s
19/11/15 20:57:01 INFO DAGScheduler: looking for newly runnable stages
19/11/15 20:57:01 INFO DAGScheduler: running: Set()
19/11/15 20:57:01 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/15 20:57:01 INFO DAGScheduler: failed: Set()
19/11/15 20:57:01 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/15 20:57:01 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.5 GB)
19/11/15 20:57:01 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.3 KB, free 13.5 GB)
19/11/15 20:57:01 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:36559 (size: 12.3 KB, free: 13.5 GB)
19/11/15 20:57:01 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/15 20:57:01 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/15 20:57:01 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/15 20:57:01 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/15 20:57:01 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/15 20:57:01 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/15 20:57:01 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 1 ms
19/11/15 20:57:01 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestvSVzhp/job_e54b0a12-2556-437a-8d54-2e8093e7ed57/MANIFEST
19/11/15 20:57:01 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestvSVzhp/job_e54b0a12-2556-437a-8d54-2e8093e7ed57/MANIFEST -> 0 artifacts
19/11/15 20:57:01 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 20:57:01 INFO main: Logging handler created.
19/11/15 20:57:01 INFO start: Status HTTP server running at localhost:46637
19/11/15 20:57:01 INFO main: semi_persistent_directory: /tmp
19/11/15 20:57:01 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 20:57:01 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573851416.44_51f9e587-e649-4faa-b544-e5af8edf9bd7', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 20:57:01 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573851416.44', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35777'}
19/11/15 20:57:01 INFO __init__: Creating state cache with size 0
19/11/15 20:57:01 INFO __init__: Creating insecure control channel for localhost:34559.
19/11/15 20:57:02 INFO __init__: Control channel established.
19/11/15 20:57:02 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/15 20:57:02 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/15 20:57:02 INFO create_state_handler: Creating insecure state channel for localhost:44037.
19/11/15 20:57:02 INFO create_state_handler: State channel established.
19/11/15 20:57:02 INFO create_data_channel: Creating client data channel for localhost:42625
19/11/15 20:57:02 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 20:57:02 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 20:57:02 INFO run: No more requests from control plane
19/11/15 20:57:02 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 20:57:02 INFO close: Closing all cached grpc data channels.
19/11/15 20:57:02 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 20:57:02 INFO close: Closing all cached gRPC state handlers.
19/11/15 20:57:02 INFO run: Done consuming work.
19/11/15 20:57:02 INFO main: Python sdk harness exiting.
19/11/15 20:57:02 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 20:57:02 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 20:57:02 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 11970 bytes result sent to driver
19/11/15 20:57:02 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 781 ms on localhost (executor driver) (1/1)
19/11/15 20:57:02 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/15 20:57:02 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 0.786 s
19/11/15 20:57:02 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 4.753434 s
19/11/15 20:57:02 INFO SparkPipelineRunner: Job test_windowing_1573851416.44_51f9e587-e649-4faa-b544-e5af8edf9bd7 finished.
19/11/15 20:57:02 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/15 20:57:02 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktestvSVzhp/job_e54b0a12-2556-437a-8d54-2e8093e7ed57/MANIFEST has 0 artifact locations
19/11/15 20:57:02 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestvSVzhp/job_e54b0a12-2556-437a-8d54-2e8093e7ed57/
INFO:root:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 229, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 419, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 73, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
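
test_pardo_state_with_custom_key_coder, which timed out here as in the other
runs in this thread, exercises user state on a keyed PCollection: the runner
routes each state request by the element's encoded key, so a custom key coder
is the interesting case. A minimal stateful ParDo of that shape, assuming the
standard userstate API; illustrative, not the test's actual code:

import apache_beam as beam
from apache_beam.coders import VarIntCoder
from apache_beam.transforms.userstate import BagStateSpec

class RunningTotalDoFn(beam.DoFn):
    # Hypothetical DoFn: one bag cell per key and window; each read() below
    # becomes a state request from the SDK harness to the runner.
    TOTALS = BagStateSpec('totals', VarIntCoder())

    def process(self, element, state=beam.DoFn.StateParam(TOTALS)):
        key, value = element
        state.add(value)
        yield key, sum(state.read())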

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 497, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 429, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1573851407.85_106b2a91-bc07-40cc-a95c-3a0c510933a8 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139827542480640)>
# Thread: <Thread(Thread-116, started daemon 139827559266048)>
# Thread: <_MainThread(MainThread, started 139828411627264)>

----------------------------------------------------------------------
Ran 38 tests in 278.031s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 10s
59 actionable tasks: 46 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/udrgi2mctokmw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1545

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1545/display/redirect>

Changes:


------------------------------------------
[...truncated 1.66 MB...]
19/11/15 18:57:55 INFO DAGScheduler: waiting: Set(ShuffleMapStage 132, ResultStage 133)
19/11/15 18:57:55 INFO DAGScheduler: failed: Set()
19/11/15 18:57:55 INFO DAGScheduler: Submitting ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115), which has no missing parents
19/11/15 18:57:55 INFO MemoryStore: Block broadcast_129 stored as values in memory (estimated size 57.2 KB, free 13.5 GB)
19/11/15 18:57:55 INFO MemoryStore: Block broadcast_129_piece0 stored as bytes in memory (estimated size 22.9 KB, free 13.5 GB)
19/11/15 18:57:55 INFO BlockManagerInfo: Added broadcast_129_piece0 in memory on localhost:40023 (size: 22.9 KB, free: 13.5 GB)
19/11/15 18:57:55 INFO SparkContext: Created broadcast 129 from broadcast at DAGScheduler.scala:1161
19/11/15 18:57:55 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115) (first 15 tasks are for partitions Vector(0, 1))
19/11/15 18:57:55 INFO TaskSchedulerImpl: Adding task set 132.0 with 2 tasks
19/11/15 18:57:55 INFO TaskSetManager: Starting task 1.0 in stage 132.0 (TID 159, localhost, executor driver, partition 1, NODE_LOCAL, 7760 bytes)
19/11/15 18:57:55 INFO Executor: Running task 1.0 in stage 132.0 (TID 159)
19/11/15 18:57:55 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/15 18:57:55 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 1 ms
19/11/15 18:57:55 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestWRReuM/job_ee4436b5-ecb9-4d3e-99be-3389ed3d4b96/MANIFEST
19/11/15 18:57:55 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestWRReuM/job_ee4436b5-ecb9-4d3e-99be-3389ed3d4b96/MANIFEST -> 0 artifacts
19/11/15 18:57:56 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 18:57:56 INFO main: Logging handler created.
19/11/15 18:57:56 INFO start: Status HTTP server running at localhost:38459
19/11/15 18:57:56 INFO main: semi_persistent_directory: /tmp
19/11/15 18:57:56 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 18:57:56 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573844272.98_c3b0a123-665a-4c28-bf15-f0a831b1afdb', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 18:57:56 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573844272.98', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:59235'}
19/11/15 18:57:56 INFO __init__: Creating state cache with size 0
19/11/15 18:57:56 INFO __init__: Creating insecure control channel for localhost:43009.
19/11/15 18:57:56 INFO __init__: Control channel established.
19/11/15 18:57:56 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/15 18:57:56 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/15 18:57:56 INFO create_state_handler: Creating insecure state channel for localhost:44373.
19/11/15 18:57:56 INFO create_state_handler: State channel established.
19/11/15 18:57:56 INFO create_data_channel: Creating client data channel for localhost:43461
19/11/15 18:57:56 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 18:57:56 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 18:57:56 INFO run: No more requests from control plane
19/11/15 18:57:56 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 18:57:56 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 18:57:56 INFO close: Closing all cached grpc data channels.
19/11/15 18:57:56 INFO close: Closing all cached gRPC state handlers.
19/11/15 18:57:56 INFO run: Done consuming work.
19/11/15 18:57:56 INFO main: Python sdk harness exiting.
19/11/15 18:57:56 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 18:57:56 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 18:57:56 INFO Executor: Finished task 1.0 in stage 132.0 (TID 159). 15229 bytes result sent to driver
19/11/15 18:57:56 INFO TaskSetManager: Starting task 0.0 in stage 132.0 (TID 160, localhost, executor driver, partition 0, PROCESS_LOCAL, 7977 bytes)
19/11/15 18:57:56 INFO Executor: Running task 0.0 in stage 132.0 (TID 160)
19/11/15 18:57:56 INFO TaskSetManager: Finished task 1.0 in stage 132.0 (TID 159) in 952 ms on localhost (executor driver) (1/2)
19/11/15 18:57:56 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestWRReuM/job_ee4436b5-ecb9-4d3e-99be-3389ed3d4b96/MANIFEST
19/11/15 18:57:56 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestWRReuM/job_ee4436b5-ecb9-4d3e-99be-3389ed3d4b96/MANIFEST -> 0 artifacts
19/11/15 18:57:57 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 18:57:57 INFO main: Logging handler created.
19/11/15 18:57:57 INFO start: Status HTTP server running at localhost:39517
19/11/15 18:57:57 INFO main: semi_persistent_directory: /tmp
19/11/15 18:57:57 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 18:57:57 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573844272.98_c3b0a123-665a-4c28-bf15-f0a831b1afdb', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 18:57:57 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573844272.98', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:59235'}
19/11/15 18:57:57 INFO __init__: Creating state cache with size 0
19/11/15 18:57:57 INFO __init__: Creating insecure control channel for localhost:40053.
19/11/15 18:57:57 INFO __init__: Control channel established.
19/11/15 18:57:57 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/15 18:57:57 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/15 18:57:57 INFO create_state_handler: Creating insecure state channel for localhost:38365.
19/11/15 18:57:57 INFO create_state_handler: State channel established.
19/11/15 18:57:57 INFO create_data_channel: Creating client data channel for localhost:39963
19/11/15 18:57:57 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 18:57:57 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 18:57:57 INFO run: No more requests from control plane
19/11/15 18:57:57 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 18:57:57 INFO close: Closing all cached grpc data channels.
19/11/15 18:57:57 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 18:57:57 INFO close: Closing all cached gRPC state handlers.
19/11/15 18:57:57 INFO run: Done consuming work.
19/11/15 18:57:57 INFO main: Python sdk harness exiting.
19/11/15 18:57:57 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 18:57:57 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 18:57:57 INFO Executor: Finished task 0.0 in stage 132.0 (TID 160). 13710 bytes result sent to driver
19/11/15 18:57:57 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 160) in 980 ms on localhost (executor driver) (2/2)
19/11/15 18:57:57 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/15 18:57:57 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 1.940 s
19/11/15 18:57:57 INFO DAGScheduler: looking for newly runnable stages
19/11/15 18:57:57 INFO DAGScheduler: running: Set()
19/11/15 18:57:57 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/15 18:57:57 INFO DAGScheduler: failed: Set()
19/11/15 18:57:57 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/15 18:57:57 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.5 GB)
19/11/15 18:57:57 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.4 KB, free 13.5 GB)
19/11/15 18:57:57 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:40023 (size: 12.4 KB, free: 13.5 GB)
19/11/15 18:57:57 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/15 18:57:57 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/15 18:57:57 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/15 18:57:57 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/15 18:57:57 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/15 18:57:57 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/15 18:57:57 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/15 18:57:57 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestWRReuM/job_ee4436b5-ecb9-4d3e-99be-3389ed3d4b96/MANIFEST
19/11/15 18:57:57 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestWRReuM/job_ee4436b5-ecb9-4d3e-99be-3389ed3d4b96/MANIFEST -> 0 artifacts
19/11/15 18:57:58 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 18:57:58 INFO main: Logging handler created.
19/11/15 18:57:58 INFO start: Status HTTP server running at localhost:33079
19/11/15 18:57:58 INFO main: semi_persistent_directory: /tmp
19/11/15 18:57:58 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 18:57:58 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573844272.98_c3b0a123-665a-4c28-bf15-f0a831b1afdb', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 18:57:58 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573844272.98', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:59235'}
19/11/15 18:57:58 INFO __init__: Creating state cache with size 0
19/11/15 18:57:58 INFO __init__: Creating insecure control channel for localhost:36289.
19/11/15 18:57:58 INFO __init__: Control channel established.
19/11/15 18:57:58 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/15 18:57:58 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/15 18:57:58 INFO create_state_handler: Creating insecure state channel for localhost:34045.
19/11/15 18:57:58 INFO create_state_handler: State channel established.
19/11/15 18:57:58 INFO create_data_channel: Creating client data channel for localhost:44409
19/11/15 18:57:58 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 18:57:58 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 18:57:58 INFO run: No more requests from control plane
19/11/15 18:57:58 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 18:57:58 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 18:57:58 INFO close: Closing all cached grpc data channels.
19/11/15 18:57:58 INFO close: Closing all cached gRPC state handlers.
19/11/15 18:57:58 INFO run: Done consuming work.
19/11/15 18:57:58 INFO main: Python sdk harness exiting.
19/11/15 18:57:58 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 18:57:58 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 18:57:58 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 11970 bytes result sent to driver
19/11/15 18:57:58 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 961 ms on localhost (executor driver) (1/1)
19/11/15 18:57:58 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/15 18:57:58 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 0.970 s
19/11/15 18:57:58 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 4.756656 s
19/11/15 18:57:58 INFO SparkPipelineRunner: Job test_windowing_1573844272.98_c3b0a123-665a-4c28-bf15-f0a831b1afdb finished.
19/11/15 18:57:58 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/15 18:57:58 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktestWRReuM/job_ee4436b5-ecb9-4d3e-99be-3389ed3d4b96/MANIFEST has 0 artifact locations
19/11/15 18:57:58 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestWRReuM/job_ee4436b5-ecb9-4d3e-99be-3389ed3d4b96/
INFO:root:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 229, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 419, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 73, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140122278323968)>
# Thread: <Thread(Thread-120, started daemon 140122286716672)>
# Thread: <_MainThread(MainThread, started 140123066156800)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140122252621568)>
# Thread: <Thread(Thread-126, started daemon 140122261014272)>
# Thread: <_MainThread(MainThread, started 140123066156800)>
# Thread: <Thread(Thread-120, started daemon 140122286716672)>
# Thread: <Thread(wait_until_finish_read, started daemon 140122278323968)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 326, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 419, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 73, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
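
Every one of these timeouts is stuck at the same spot: wait_until_finish
iterating a gRPC state stream from the job service. A sketch of the blocking
loop implied by the portable_runner.py frames; the proto module and field
names follow the Beam Job API but are assumptions here, not copied from the
source:

from apache_beam.portability.api import beam_job_api_pb2

TERMINAL_STATES = (
    beam_job_api_pb2.JobState.DONE,
    beam_job_api_pb2.JobState.FAILED,
    beam_job_api_pb2.JobState.CANCELLED,
)

def wait_until_finish(state_stream):
    # Blocks inside the stream's gRPC iterator (the _channel.py next/_next
    # frames above) until the job service reports a terminal state, or until
    # the test watchdog fires first, as happened here.
    for state_response in state_stream:
        if state_response.state in TERMINAL_STATES:
            return state_response.state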

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 497, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 429, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1573844263.26_dad16640-af5a-4278-adac-b150c6523368 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 348.007s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 40s
59 actionable tasks: 46 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/xgjnahfnnmgv6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1544

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1544/display/redirect?page=changes>

Changes:

[sunjincheng121] [BEAM-8557] Add log for the dropped unknown response

[migryz] Bump python precommit timeout

[lcwik] [BEAM-8151] Swap to create SdkWorkers on demand when processing jobs


------------------------------------------
[...truncated 1.67 MB...]
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2951
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2859
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2606
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2594
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2647
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2582
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2930
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2744
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2692
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2587
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2558
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2488
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2846
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2800
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2636
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2842
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2917
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2788
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2811
19/11/15 18:06:08 INFO BlockManagerInfo: Removed broadcast_128_piece0 on localhost:44963 in memory (size: 10.8 KB, free: 13.5 GB)
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2781
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2511
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2782
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2568
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2780
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2815
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2814
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2689
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2900
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2889
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2768
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2534
19/11/15 18:06:08 INFO BlockManagerInfo: Removed broadcast_121_piece0 on localhost:44963 in memory (size: 8.9 KB, free: 13.5 GB)
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2894
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2924
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2911
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2640
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2946
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2857
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2718
19/11/15 18:06:08 INFO BlockManagerInfo: Removed broadcast_124_piece0 on localhost:44963 in memory (size: 11.9 KB, free: 13.5 GB)
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2863
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2756
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2599
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2563
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2537
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2787
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2891
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2717
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2882
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2714
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2486
19/11/15 18:06:08 INFO ContextCleaner: Cleaned accumulator 2477
19/11/15 18:06:09 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 18:06:09 INFO main: Logging handler created.
19/11/15 18:06:09 INFO start: Status HTTP server running at localhost:35315
19/11/15 18:06:09 INFO main: semi_persistent_directory: /tmp
19/11/15 18:06:09 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 18:06:09 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573841164.86_4778714a-29be-42cd-938d-04a41f49c8a6', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 18:06:09 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573841164.86', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44715'}
19/11/15 18:06:09 INFO __init__: Creating state cache with size 0
19/11/15 18:06:09 INFO __init__: Creating insecure control channel for localhost:33737.
19/11/15 18:06:09 INFO __init__: Control channel established.
19/11/15 18:06:09 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/15 18:06:09 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/15 18:06:09 INFO create_state_handler: Creating insecure state channel for localhost:42657.
19/11/15 18:06:09 INFO create_state_handler: State channel established.
19/11/15 18:06:09 INFO create_data_channel: Creating client data channel for localhost:43139
19/11/15 18:06:09 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 18:06:09 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 18:06:09 INFO run: No more requests from control plane
19/11/15 18:06:09 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 18:06:09 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 18:06:09 INFO close: Closing all cached grpc data channels.
19/11/15 18:06:09 INFO close: Closing all cached gRPC state handlers.
19/11/15 18:06:09 INFO run: Done consuming work.
19/11/15 18:06:09 INFO main: Python sdk harness exiting.
19/11/15 18:06:09 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 18:06:09 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 18:06:09 INFO Executor: Finished task 0.0 in stage 132.0 (TID 160). 13753 bytes result sent to driver
19/11/15 18:06:09 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 160) in 879 ms on localhost (executor driver) (2/2)
19/11/15 18:06:09 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/15 18:06:09 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 1.764 s
19/11/15 18:06:09 INFO DAGScheduler: looking for newly runnable stages
19/11/15 18:06:09 INFO DAGScheduler: running: Set()
19/11/15 18:06:09 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/15 18:06:09 INFO DAGScheduler: failed: Set()
19/11/15 18:06:09 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/15 18:06:09 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.5 GB)
19/11/15 18:06:09 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.3 KB, free 13.5 GB)
19/11/15 18:06:09 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:44963 (size: 12.3 KB, free: 13.5 GB)
19/11/15 18:06:09 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/15 18:06:09 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/15 18:06:09 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/15 18:06:09 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/15 18:06:09 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/15 18:06:09 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/15 18:06:09 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/15 18:06:09 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestx7YS_F/job_e9816b93-bb0a-4999-8d39-9fb335121d7a/MANIFEST
19/11/15 18:06:09 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestx7YS_F/job_e9816b93-bb0a-4999-8d39-9fb335121d7a/MANIFEST -> 0 artifacts
19/11/15 18:06:10 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 18:06:10 INFO main: Logging handler created.
19/11/15 18:06:10 INFO start: Status HTTP server running at localhost:34057
19/11/15 18:06:10 INFO main: semi_persistent_directory: /tmp
19/11/15 18:06:10 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 18:06:10 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573841164.86_4778714a-29be-42cd-938d-04a41f49c8a6', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 18:06:10 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573841164.86', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44715'}
19/11/15 18:06:10 INFO __init__: Creating state cache with size 0
19/11/15 18:06:10 INFO __init__: Creating insecure control channel for localhost:33165.
19/11/15 18:06:10 INFO __init__: Control channel established.
19/11/15 18:06:10 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/15 18:06:10 INFO __init__: Initializing SDKHarness with unbounded number of workers.
19/11/15 18:06:10 INFO create_state_handler: Creating insecure state channel for localhost:36291.
19/11/15 18:06:10 INFO create_state_handler: State channel established.
19/11/15 18:06:10 INFO create_data_channel: Creating client data channel for localhost:46003
19/11/15 18:06:10 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 18:06:10 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 18:06:10 INFO run: No more requests from control plane
19/11/15 18:06:10 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 18:06:10 INFO close: Closing all cached grpc data channels.
19/11/15 18:06:10 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 18:06:10 INFO close: Closing all cached gRPC state handlers.
19/11/15 18:06:10 INFO run: Done consuming work.
19/11/15 18:06:10 INFO main: Python sdk harness exiting.
19/11/15 18:06:10 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 18:06:10 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 18:06:10 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 11970 bytes result sent to driver
19/11/15 18:06:10 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 836 ms on localhost (executor driver) (1/1)
19/11/15 18:06:10 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/15 18:06:10 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 0.843 s
19/11/15 18:06:10 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 4.321571 s
19/11/15 18:06:10 INFO SparkPipelineRunner: Job test_windowing_1573841164.86_4778714a-29be-42cd-938d-04a41f49c8a6 finished.
19/11/15 18:06:10 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/15 18:06:10 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktestx7YS_F/job_e9816b93-bb0a-4999-8d39-9fb335121d7a/MANIFEST has 0 artifact locations
19/11/15 18:06:10 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestx7YS_F/job_e9816b93-bb0a-4999-8d39-9fb335121d7a/
INFO:root:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 229, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 419, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 73, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
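
The 60-second timeout above, and the "# Thread:" dumps that appear mixed into later tracebacks, come from the test harness itself: portable_runner_test.py (line 73 in the frames above) installs a handler that prints the live threads and raises BaseException when a test overruns. A minimal sketch of that pattern, assuming an alarm-based mechanism and illustrative names rather than the exact Beam code:

    import signal
    import threading

    TIMEOUT_SECS = 60  # matches the "Timed out after 60 seconds" message

    def install_test_timeout(timeout=TIMEOUT_SECS):
        def handler(signum, frame):
            msg = 'Timed out after %d seconds.' % timeout
            print('==================== %s ====================' % msg)
            # Dump every live thread, producing the "# Thread: ..." lines
            # seen mixed into the tracebacks in this log.
            for t in threading.enumerate():
                print('# Thread: %r' % t)
            raise BaseException(msg)
        signal.signal(signal.SIGALRM, handler)  # Unix-only
        signal.alarm(timeout)

Because the handler fires on the main thread while gRPC is blocked in threading.py's wait(), the exception surfaces inside wait_until_finish(), which is exactly the frame sequence each timeout traceback shows.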

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140009004828416)>

# Thread: <Thread(Thread-120, started daemon 140009013221120)>

# Thread: <_MainThread(MainThread, started 140009799784192)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140008979650304)>

# Thread: <Thread(Thread-126, started daemon 140008988043008)>

# Thread: <_MainThread(MainThread, started 140009799784192)>

Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 326, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 419, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 73, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 497, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 429, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1573841156.08_0d288ffc-742a-4125-9e68-2f71c1cef676 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
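
Unlike the two timeouts, test_sdf_with_watermark_tracking fails in the pipeline itself: the splittable DoFn in that test checkpoints mid-bundle, and the Spark portable runner's ActiveBundle has nothing registered to receive the checkpointed residual work, hence the UnsupportedOperationException. A sketch of the pattern the error message implies, in illustrative Python rather than the actual Beam Java code:

    class ActiveBundle(object):
        """Runner-side view of a bundle being processed by the SDK harness."""

        def __init__(self, checkpoint_handler=None):
            # The Spark portable runner leaves this unset, so any SDF that
            # returns residual work trips the error quoted above.
            self._checkpoint_handler = checkpoint_handler

        def on_checkpoint(self, residuals):
            if self._checkpoint_handler is None:
                raise NotImplementedError(
                    'The ActiveBundle does not have a registered bundle '
                    'checkpoint handler.')
            # A complete runner would reschedule the residuals here.
            self._checkpoint_handler(residuals)

Until such a handler is wired up, this test can be expected to fail the same way on every run, which the repeated builds below bear out.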

----------------------------------------------------------------------
Ran 38 tests in 291.696s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
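
To reproduce the failure locally, the failing task named above can be invoked directly from a Beam checkout via the Gradle wrapper (a typical invocation; --stacktrace is the flag Gradle suggests above):

    ./gradlew :sdks:python:test-suites:portable:py2:sparkValidatesRunner --stacktrace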

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 6s
59 actionable tasks: 46 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/la45teijpxfxk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1543

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1543/display/redirect?page=changes>

Changes:

[robertwb] Add option to test metrics on runners without gauge support.


------------------------------------------
[...truncated 1.66 MB...]
19/11/15 16:59:58 INFO DAGScheduler: waiting: Set(ShuffleMapStage 132, ResultStage 133)
19/11/15 16:59:58 INFO DAGScheduler: failed: Set()
19/11/15 16:59:58 INFO DAGScheduler: Submitting ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115), which has no missing parents
19/11/15 16:59:58 INFO MemoryStore: Block broadcast_129 stored as values in memory (estimated size 57.2 KB, free 13.5 GB)
19/11/15 16:59:58 INFO MemoryStore: Block broadcast_129_piece0 stored as bytes in memory (estimated size 22.1 KB, free 13.5 GB)
19/11/15 16:59:58 INFO BlockManagerInfo: Added broadcast_129_piece0 in memory on localhost:42563 (size: 22.1 KB, free: 13.5 GB)
19/11/15 16:59:58 INFO SparkContext: Created broadcast 129 from broadcast at DAGScheduler.scala:1161
19/11/15 16:59:58 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115) (first 15 tasks are for partitions Vector(0, 1))
19/11/15 16:59:58 INFO TaskSchedulerImpl: Adding task set 132.0 with 2 tasks
19/11/15 16:59:58 INFO TaskSetManager: Starting task 0.0 in stage 132.0 (TID 159, localhost, executor driver, partition 0, NODE_LOCAL, 7760 bytes)
19/11/15 16:59:58 INFO Executor: Running task 0.0 in stage 132.0 (TID 159)
19/11/15 16:59:58 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/15 16:59:58 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/15 16:59:58 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktesttfj9Pn/job_c9a170c2-c5ba-4e66-9e68-b3efcb456035/MANIFEST
19/11/15 16:59:58 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktesttfj9Pn/job_c9a170c2-c5ba-4e66-9e68-b3efcb456035/MANIFEST -> 0 artifacts
19/11/15 16:59:59 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 16:59:59 INFO main: Logging handler created.
19/11/15 16:59:59 INFO start: Status HTTP server running at localhost:35473
19/11/15 16:59:59 INFO main: semi_persistent_directory: /tmp
19/11/15 16:59:59 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 16:59:59 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573837194.36_519afbdc-3781-4df7-96fc-329972a1fd2a', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 16:59:59 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573837194.36', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:55731'}
19/11/15 16:59:59 INFO __init__: Creating state cache with size 0
19/11/15 16:59:59 INFO __init__: Creating insecure control channel for localhost:38179.
19/11/15 16:59:59 INFO __init__: Control channel established.
19/11/15 16:59:59 INFO __init__: Initializing SDKHarness with 12 workers.
19/11/15 16:59:59 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/15 16:59:59 INFO create_state_handler: Creating insecure state channel for localhost:45001.
19/11/15 16:59:59 INFO create_state_handler: State channel established.
19/11/15 16:59:59 INFO create_data_channel: Creating client data channel for localhost:36735
19/11/15 16:59:59 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 16:59:59 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 16:59:59 INFO run: No more requests from control plane
19/11/15 16:59:59 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 16:59:59 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 16:59:59 INFO close: Closing all cached grpc data channels.
19/11/15 16:59:59 INFO close: Closing all cached gRPC state handlers.
19/11/15 16:59:59 INFO run: Done consuming work.
19/11/15 16:59:59 INFO main: Python sdk harness exiting.
19/11/15 16:59:59 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 16:59:59 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 17:00:00 INFO Executor: Finished task 0.0 in stage 132.0 (TID 159). 15229 bytes result sent to driver
19/11/15 17:00:00 INFO TaskSetManager: Starting task 1.0 in stage 132.0 (TID 160, localhost, executor driver, partition 1, PROCESS_LOCAL, 7977 bytes)
19/11/15 17:00:00 INFO Executor: Running task 1.0 in stage 132.0 (TID 160)
19/11/15 17:00:00 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 159) in 1432 ms on localhost (executor driver) (1/2)
19/11/15 17:00:00 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktesttfj9Pn/job_c9a170c2-c5ba-4e66-9e68-b3efcb456035/MANIFEST
19/11/15 17:00:00 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktesttfj9Pn/job_c9a170c2-c5ba-4e66-9e68-b3efcb456035/MANIFEST -> 0 artifacts
19/11/15 17:00:01 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 17:00:01 INFO main: Logging handler created.
19/11/15 17:00:01 INFO main: semi_persistent_directory: /tmp
19/11/15 17:00:01 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 17:00:01 INFO start: Status HTTP server running at localhost:41411
19/11/15 17:00:01 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573837194.36_519afbdc-3781-4df7-96fc-329972a1fd2a', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 17:00:01 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573837194.36', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:55731'}
19/11/15 17:00:01 INFO __init__: Creating state cache with size 0
19/11/15 17:00:01 INFO __init__: Creating insecure control channel for localhost:41399.
19/11/15 17:00:01 INFO __init__: Control channel established.
19/11/15 17:00:01 INFO __init__: Initializing SDKHarness with 12 workers.
19/11/15 17:00:01 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/15 17:00:01 INFO create_state_handler: Creating insecure state channel for localhost:42939.
19/11/15 17:00:01 INFO create_state_handler: State channel established.
19/11/15 17:00:01 INFO create_data_channel: Creating client data channel for localhost:42361
19/11/15 17:00:01 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 17:00:02 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 17:00:02 INFO run: No more requests from control plane
19/11/15 17:00:02 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 17:00:02 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 17:00:02 INFO close: Closing all cached grpc data channels.
19/11/15 17:00:02 INFO close: Closing all cached gRPC state handlers.
19/11/15 17:00:02 INFO run: Done consuming work.
19/11/15 17:00:02 INFO main: Python sdk harness exiting.
19/11/15 17:00:02 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 17:00:02 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 17:00:02 INFO Executor: Finished task 1.0 in stage 132.0 (TID 160). 13710 bytes result sent to driver
19/11/15 17:00:02 INFO TaskSetManager: Finished task 1.0 in stage 132.0 (TID 160) in 2219 ms on localhost (executor driver) (2/2)
19/11/15 17:00:02 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/15 17:00:02 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 3.657 s
19/11/15 17:00:02 INFO DAGScheduler: looking for newly runnable stages
19/11/15 17:00:02 INFO DAGScheduler: running: Set()
19/11/15 17:00:02 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/15 17:00:02 INFO DAGScheduler: failed: Set()
19/11/15 17:00:02 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/15 17:00:02 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.5 GB)
19/11/15 17:00:02 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.3 KB, free 13.5 GB)
19/11/15 17:00:02 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:42563 (size: 12.3 KB, free: 13.5 GB)
19/11/15 17:00:02 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/15 17:00:02 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/15 17:00:02 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/15 17:00:02 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/15 17:00:02 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/15 17:00:02 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/15 17:00:02 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/15 17:00:02 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktesttfj9Pn/job_c9a170c2-c5ba-4e66-9e68-b3efcb456035/MANIFEST
19/11/15 17:00:02 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktesttfj9Pn/job_c9a170c2-c5ba-4e66-9e68-b3efcb456035/MANIFEST -> 0 artifacts
19/11/15 17:00:04 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 17:00:04 INFO main: Logging handler created.
19/11/15 17:00:04 INFO main: semi_persistent_directory: /tmp
19/11/15 17:00:04 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 17:00:04 INFO start: Status HTTP server running at localhost:34339
19/11/15 17:00:04 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573837194.36_519afbdc-3781-4df7-96fc-329972a1fd2a', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 17:00:04 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573837194.36', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:55731'}
19/11/15 17:00:04 INFO __init__: Creating state cache with size 0
19/11/15 17:00:04 INFO __init__: Creating insecure control channel for localhost:41771.
19/11/15 17:00:04 INFO __init__: Control channel established.
19/11/15 17:00:04 INFO __init__: Initializing SDKHarness with 12 workers.
19/11/15 17:00:04 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/15 17:00:04 INFO create_state_handler: Creating insecure state channel for localhost:41213.
19/11/15 17:00:04 INFO create_state_handler: State channel established.
19/11/15 17:00:04 INFO create_data_channel: Creating client data channel for localhost:36741
19/11/15 17:00:04 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 17:00:04 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 17:00:04 INFO run: No more requests from control plane
19/11/15 17:00:04 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 17:00:04 INFO close: Closing all cached grpc data channels.
19/11/15 17:00:04 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 17:00:04 INFO close: Closing all cached gRPC state handlers.
19/11/15 17:00:04 INFO run: Done consuming work.
19/11/15 17:00:04 INFO main: Python sdk harness exiting.
19/11/15 17:00:04 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 17:00:04 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 17:00:04 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 11970 bytes result sent to driver
19/11/15 17:00:04 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 2349 ms on localhost (executor driver) (1/1)
19/11/15 17:00:04 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/15 17:00:04 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 2.358 s
19/11/15 17:00:04 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 8.450839 s
19/11/15 17:00:04 INFO SparkPipelineRunner: Job test_windowing_1573837194.36_519afbdc-3781-4df7-96fc-329972a1fd2a finished.
19/11/15 17:00:04 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/15 17:00:04 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktesttfj9Pn/job_c9a170c2-c5ba-4e66-9e68-b3efcb456035/MANIFEST has 0 artifact locations
19/11/15 17:00:04 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktesttfj9Pn/job_c9a170c2-c5ba-4e66-9e68-b3efcb456035/
INFO:root:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 229, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 420, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 73, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140535777740544)>

# Thread: <Thread(Thread-120, started daemon 140535794525952)>

# Thread: <_MainThread(MainThread, started 140536573757184)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140535155128064)>

# Thread: <Thread(Thread-126, started daemon 140535768561408)>

# Thread: <_MainThread(MainThread, started 140536573757184)>

# Thread: <Thread(wait_until_finish_read, started daemon 140535777740544)>

# Thread: <Thread(Thread-120, started daemon 140535794525952)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 326, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 420, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 73, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 497, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 430, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1573837180.94_7130dbfe-5c4d-456a-8182-3685e88c9cf9 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 343.213s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 33s
59 actionable tasks: 46 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/txopgqx5eqdti

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1542

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1542/display/redirect?page=changes>

Changes:

[robertwb] [BEAM-8655] Strengthen trigger transcript test to use multiple keys.

[robertwb] [BEAM-8655] Run trigger transcript tests with combiner as well as

[robertwb] [BEAM-8655] Run the subset of trigger tests that make sense in batch


------------------------------------------
[...truncated 1.66 MB...]
19/11/15 16:38:05 INFO DAGScheduler: waiting: Set(ShuffleMapStage 132, ResultStage 133)
19/11/15 16:38:05 INFO DAGScheduler: failed: Set()
19/11/15 16:38:05 INFO DAGScheduler: Submitting ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115), which has no missing parents
19/11/15 16:38:05 INFO MemoryStore: Block broadcast_129 stored as values in memory (estimated size 57.2 KB, free 13.5 GB)
19/11/15 16:38:05 INFO MemoryStore: Block broadcast_129_piece0 stored as bytes in memory (estimated size 22.1 KB, free 13.5 GB)
19/11/15 16:38:05 INFO BlockManagerInfo: Added broadcast_129_piece0 in memory on localhost:46553 (size: 22.1 KB, free: 13.5 GB)
19/11/15 16:38:05 INFO SparkContext: Created broadcast 129 from broadcast at DAGScheduler.scala:1161
19/11/15 16:38:05 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115) (first 15 tasks are for partitions Vector(0, 1))
19/11/15 16:38:05 INFO TaskSchedulerImpl: Adding task set 132.0 with 2 tasks
19/11/15 16:38:05 INFO TaskSetManager: Starting task 0.0 in stage 132.0 (TID 159, localhost, executor driver, partition 0, NODE_LOCAL, 7760 bytes)
19/11/15 16:38:05 INFO Executor: Running task 0.0 in stage 132.0 (TID 159)
19/11/15 16:38:05 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/15 16:38:05 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/15 16:38:05 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestKxy5sf/job_e832f4e5-1d03-4bfb-a3e4-07d249c64e31/MANIFEST
19/11/15 16:38:05 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestKxy5sf/job_e832f4e5-1d03-4bfb-a3e4-07d249c64e31/MANIFEST -> 0 artifacts
19/11/15 16:38:06 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 16:38:06 INFO main: Logging handler created.
19/11/15 16:38:06 INFO start: Status HTTP server running at localhost:36049
19/11/15 16:38:06 INFO main: semi_persistent_directory: /tmp
19/11/15 16:38:06 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 16:38:06 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573835882.72_787492a0-4ce8-480f-b3d7-6bb4175daaa0', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 16:38:06 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573835882.72', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41655'}
19/11/15 16:38:06 INFO __init__: Creating state cache with size 0
19/11/15 16:38:06 INFO __init__: Creating insecure control channel for localhost:44585.
19/11/15 16:38:06 INFO __init__: Control channel established.
19/11/15 16:38:06 INFO __init__: Initializing SDKHarness with 12 workers.
19/11/15 16:38:06 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/15 16:38:06 INFO create_state_handler: Creating insecure state channel for localhost:42155.
19/11/15 16:38:06 INFO create_state_handler: State channel established.
19/11/15 16:38:06 INFO create_data_channel: Creating client data channel for localhost:38015
19/11/15 16:38:06 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 16:38:06 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 16:38:06 INFO run: No more requests from control plane
19/11/15 16:38:06 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 16:38:06 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 16:38:06 INFO close: Closing all cached grpc data channels.
19/11/15 16:38:06 INFO close: Closing all cached gRPC state handlers.
19/11/15 16:38:06 INFO run: Done consuming work.
19/11/15 16:38:06 INFO main: Python sdk harness exiting.
19/11/15 16:38:06 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 16:38:06 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 16:38:06 INFO Executor: Finished task 0.0 in stage 132.0 (TID 159). 15229 bytes result sent to driver
19/11/15 16:38:06 INFO TaskSetManager: Starting task 1.0 in stage 132.0 (TID 160, localhost, executor driver, partition 1, PROCESS_LOCAL, 7977 bytes)
19/11/15 16:38:06 INFO Executor: Running task 1.0 in stage 132.0 (TID 160)
19/11/15 16:38:06 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 159) in 875 ms on localhost (executor driver) (1/2)
19/11/15 16:38:06 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestKxy5sf/job_e832f4e5-1d03-4bfb-a3e4-07d249c64e31/MANIFEST
19/11/15 16:38:06 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestKxy5sf/job_e832f4e5-1d03-4bfb-a3e4-07d249c64e31/MANIFEST -> 0 artifacts
19/11/15 16:38:06 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 16:38:06 INFO main: Logging handler created.
19/11/15 16:38:06 INFO start: Status HTTP server running at localhost:46745
19/11/15 16:38:06 INFO main: semi_persistent_directory: /tmp
19/11/15 16:38:06 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 16:38:06 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573835882.72_787492a0-4ce8-480f-b3d7-6bb4175daaa0', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 16:38:06 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573835882.72', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41655'}
19/11/15 16:38:06 INFO __init__: Creating state cache with size 0
19/11/15 16:38:06 INFO __init__: Creating insecure control channel for localhost:36823.
19/11/15 16:38:06 INFO __init__: Control channel established.
19/11/15 16:38:06 INFO __init__: Initializing SDKHarness with 12 workers.
19/11/15 16:38:06 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/15 16:38:06 INFO create_state_handler: Creating insecure state channel for localhost:39397.
19/11/15 16:38:06 INFO create_state_handler: State channel established.
19/11/15 16:38:06 INFO create_data_channel: Creating client data channel for localhost:35505
19/11/15 16:38:06 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 16:38:06 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 16:38:07 INFO run: No more requests from control plane
19/11/15 16:38:07 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 16:38:07 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 16:38:07 INFO close: Closing all cached grpc data channels.
19/11/15 16:38:07 INFO close: Closing all cached gRPC state handlers.
19/11/15 16:38:07 INFO run: Done consuming work.
19/11/15 16:38:07 INFO main: Python sdk harness exiting.
19/11/15 16:38:07 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 16:38:07 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 16:38:07 INFO Executor: Finished task 1.0 in stage 132.0 (TID 160). 13710 bytes result sent to driver
19/11/15 16:38:07 INFO TaskSetManager: Finished task 1.0 in stage 132.0 (TID 160) in 821 ms on localhost (executor driver) (2/2)
19/11/15 16:38:07 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/15 16:38:07 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 1.701 s
19/11/15 16:38:07 INFO DAGScheduler: looking for newly runnable stages
19/11/15 16:38:07 INFO DAGScheduler: running: Set()
19/11/15 16:38:07 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/15 16:38:07 INFO DAGScheduler: failed: Set()
19/11/15 16:38:07 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/15 16:38:07 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.5 GB)
19/11/15 16:38:07 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.3 KB, free 13.5 GB)
19/11/15 16:38:07 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:46553 (size: 12.3 KB, free: 13.5 GB)
19/11/15 16:38:07 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/15 16:38:07 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/15 16:38:07 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/15 16:38:07 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/15 16:38:07 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/15 16:38:07 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/15 16:38:07 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/15 16:38:07 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestKxy5sf/job_e832f4e5-1d03-4bfb-a3e4-07d249c64e31/MANIFEST
19/11/15 16:38:07 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestKxy5sf/job_e832f4e5-1d03-4bfb-a3e4-07d249c64e31/MANIFEST -> 0 artifacts
19/11/15 16:38:07 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 16:38:07 INFO main: Logging handler created.
19/11/15 16:38:07 INFO start: Status HTTP server running at localhost:33609
19/11/15 16:38:07 INFO main: semi_persistent_directory: /tmp
19/11/15 16:38:07 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 16:38:07 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573835882.72_787492a0-4ce8-480f-b3d7-6bb4175daaa0', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 16:38:07 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573835882.72', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41655'}
19/11/15 16:38:07 INFO __init__: Creating state cache with size 0
19/11/15 16:38:07 INFO __init__: Creating insecure control channel for localhost:33677.
19/11/15 16:38:07 INFO __init__: Control channel established.
19/11/15 16:38:07 INFO __init__: Initializing SDKHarness with 12 workers.
19/11/15 16:38:07 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/15 16:38:07 INFO create_state_handler: Creating insecure state channel for localhost:41691.
19/11/15 16:38:07 INFO create_state_handler: State channel established.
19/11/15 16:38:07 INFO create_data_channel: Creating client data channel for localhost:36413
19/11/15 16:38:07 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 16:38:07 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 16:38:07 INFO run: No more requests from control plane
19/11/15 16:38:07 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 16:38:07 INFO close: Closing all cached grpc data channels.
19/11/15 16:38:07 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 16:38:07 INFO close: Closing all cached gRPC state handlers.
19/11/15 16:38:07 INFO run: Done consuming work.
19/11/15 16:38:07 INFO main: Python sdk harness exiting.
19/11/15 16:38:07 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 16:38:07 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 16:38:07 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 11970 bytes result sent to driver
19/11/15 16:38:07 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 819 ms on localhost (executor driver) (1/1)
19/11/15 16:38:07 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/15 16:38:07 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 0.824 s
19/11/15 16:38:07 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 4.176904 s
19/11/15 16:38:07 INFO SparkPipelineRunner: Job test_windowing_1573835882.72_787492a0-4ce8-480f-b3d7-6bb4175daaa0 finished.
19/11/15 16:38:07 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/15 16:38:07 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktestKxy5sf/job_e832f4e5-1d03-4bfb-a3e4-07d249c64e31/MANIFEST has 0 artifact locations
19/11/15 16:38:07 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestKxy5sf/job_e832f4e5-1d03-4bfb-a3e4-07d249c64e31/
INFO:root:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 229, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 420, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 73, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139845477332736)>

# Thread: <Thread(Thread-119, started daemon 139845564086016)>

# Thread: <_MainThread(MainThread, started 139846343317248)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139845468940032)>

# Thread: <Thread(Thread-125, started daemon 139845460547328)>

# Thread: <_MainThread(MainThread, started 139846343317248)>

# Thread: <Thread(Thread-119, started daemon 139845564086016)>

# Thread: <Thread(wait_until_finish_read, started daemon 139845477332736)>

Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 326, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 420, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 73, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 497, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 430, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1573835873.61_bd75f951-0a90-4447-bc82-c797b1bfbf98 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 301.474s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 34s
59 actionable tasks: 46 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/6r7uokzkqars4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1541

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1541/display/redirect>

Changes:


------------------------------------------
[...truncated 1.64 MB...]
19/11/15 12:11:25 INFO DAGScheduler: waiting: Set(ShuffleMapStage 132, ResultStage 133)
19/11/15 12:11:25 INFO DAGScheduler: failed: Set()
19/11/15 12:11:25 INFO DAGScheduler: Submitting ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115), which has no missing parents
19/11/15 12:11:25 INFO MemoryStore: Block broadcast_129 stored as values in memory (estimated size 57.2 KB, free 13.5 GB)
19/11/15 12:11:25 INFO MemoryStore: Block broadcast_129_piece0 stored as bytes in memory (estimated size 22.9 KB, free 13.5 GB)
19/11/15 12:11:25 INFO BlockManagerInfo: Added broadcast_129_piece0 in memory on localhost:39989 (size: 22.9 KB, free: 13.5 GB)
19/11/15 12:11:25 INFO SparkContext: Created broadcast 129 from broadcast at DAGScheduler.scala:1161
19/11/15 12:11:25 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115) (first 15 tasks are for partitions Vector(0, 1))
19/11/15 12:11:25 INFO TaskSchedulerImpl: Adding task set 132.0 with 2 tasks
19/11/15 12:11:25 INFO TaskSetManager: Starting task 1.0 in stage 132.0 (TID 159, localhost, executor driver, partition 1, NODE_LOCAL, 7760 bytes)
19/11/15 12:11:25 INFO Executor: Running task 1.0 in stage 132.0 (TID 159)
19/11/15 12:11:25 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/15 12:11:25 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/15 12:11:25 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest1buxLU/job_f54a29d4-ac86-4096-bb76-eb405ce98293/MANIFEST
19/11/15 12:11:25 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest1buxLU/job_f54a29d4-ac86-4096-bb76-eb405ce98293/MANIFEST -> 0 artifacts
19/11/15 12:11:26 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 12:11:26 INFO main: Logging handler created.
19/11/15 12:11:26 INFO start: Status HTTP server running at localhost:36015
19/11/15 12:11:26 INFO main: semi_persistent_directory: /tmp
19/11/15 12:11:26 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 12:11:26 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573819883.16_dd968cc0-de6f-4b59-b1d8-3edf70905b23', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 12:11:26 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573819883.16', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46665'}
19/11/15 12:11:26 INFO __init__: Creating state cache with size 0
19/11/15 12:11:26 INFO __init__: Creating insecure control channel for localhost:36453.
19/11/15 12:11:26 INFO __init__: Control channel established.
19/11/15 12:11:26 INFO __init__: Initializing SDKHarness with 12 workers.
19/11/15 12:11:26 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/15 12:11:26 INFO create_state_handler: Creating insecure state channel for localhost:34465.
19/11/15 12:11:26 INFO create_state_handler: State channel established.
19/11/15 12:11:26 INFO create_data_channel: Creating client data channel for localhost:33471
19/11/15 12:11:26 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 12:11:26 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 12:11:26 INFO run: No more requests from control plane
19/11/15 12:11:26 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 12:11:26 INFO close: Closing all cached grpc data channels.
19/11/15 12:11:26 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 12:11:26 INFO close: Closing all cached gRPC state handlers.
19/11/15 12:11:26 INFO run: Done consuming work.
19/11/15 12:11:26 INFO main: Python sdk harness exiting.
19/11/15 12:11:26 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 12:11:26 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 12:11:26 INFO Executor: Finished task 1.0 in stage 132.0 (TID 159). 15229 bytes result sent to driver
19/11/15 12:11:26 INFO TaskSetManager: Starting task 0.0 in stage 132.0 (TID 160, localhost, executor driver, partition 0, PROCESS_LOCAL, 7977 bytes)
19/11/15 12:11:26 INFO Executor: Running task 0.0 in stage 132.0 (TID 160)
19/11/15 12:11:26 INFO TaskSetManager: Finished task 1.0 in stage 132.0 (TID 159) in 859 ms on localhost (executor driver) (1/2)
19/11/15 12:11:26 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest1buxLU/job_f54a29d4-ac86-4096-bb76-eb405ce98293/MANIFEST
19/11/15 12:11:26 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest1buxLU/job_f54a29d4-ac86-4096-bb76-eb405ce98293/MANIFEST -> 0 artifacts
19/11/15 12:11:27 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 12:11:27 INFO main: Logging handler created.
19/11/15 12:11:27 INFO start: Status HTTP server running at localhost:46751
19/11/15 12:11:27 INFO main: semi_persistent_directory: /tmp
19/11/15 12:11:27 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 12:11:27 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573819883.16_dd968cc0-de6f-4b59-b1d8-3edf70905b23', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 12:11:27 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573819883.16', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46665'}
19/11/15 12:11:27 INFO __init__: Creating state cache with size 0
19/11/15 12:11:27 INFO __init__: Creating insecure control channel for localhost:45363.
19/11/15 12:11:27 INFO __init__: Control channel established.
19/11/15 12:11:27 INFO __init__: Initializing SDKHarness with 12 workers.
19/11/15 12:11:27 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/15 12:11:27 INFO create_state_handler: Creating insecure state channel for localhost:44405.
19/11/15 12:11:27 INFO create_state_handler: State channel established.
19/11/15 12:11:27 INFO create_data_channel: Creating client data channel for localhost:33613
19/11/15 12:11:27 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 12:11:27 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 12:11:27 INFO run: No more requests from control plane
19/11/15 12:11:27 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 12:11:27 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 12:11:27 INFO close: Closing all cached grpc data channels.
19/11/15 12:11:27 INFO close: Closing all cached gRPC state handlers.
19/11/15 12:11:27 INFO run: Done consuming work.
19/11/15 12:11:27 INFO main: Python sdk harness exiting.
19/11/15 12:11:27 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 12:11:27 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 12:11:27 INFO Executor: Finished task 0.0 in stage 132.0 (TID 160). 13710 bytes result sent to driver
19/11/15 12:11:27 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 160) in 769 ms on localhost (executor driver) (2/2)
19/11/15 12:11:27 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/15 12:11:27 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 1.632 s
19/11/15 12:11:27 INFO DAGScheduler: looking for newly runnable stages
19/11/15 12:11:27 INFO DAGScheduler: running: Set()
19/11/15 12:11:27 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/15 12:11:27 INFO DAGScheduler: failed: Set()
19/11/15 12:11:27 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/15 12:11:27 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.5 GB)
19/11/15 12:11:27 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.3 KB, free 13.5 GB)
19/11/15 12:11:27 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:39989 (size: 12.3 KB, free: 13.5 GB)
19/11/15 12:11:27 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/15 12:11:27 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/15 12:11:27 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/15 12:11:27 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/15 12:11:27 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/15 12:11:27 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/15 12:11:27 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 1 ms
19/11/15 12:11:27 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest1buxLU/job_f54a29d4-ac86-4096-bb76-eb405ce98293/MANIFEST
19/11/15 12:11:27 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest1buxLU/job_f54a29d4-ac86-4096-bb76-eb405ce98293/MANIFEST -> 0 artifacts
19/11/15 12:11:28 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 12:11:28 INFO main: Logging handler created.
19/11/15 12:11:28 INFO start: Status HTTP server running at localhost:43955
19/11/15 12:11:28 INFO main: semi_persistent_directory: /tmp
19/11/15 12:11:28 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 12:11:28 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573819883.16_dd968cc0-de6f-4b59-b1d8-3edf70905b23', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 12:11:28 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573819883.16', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46665'}
19/11/15 12:11:28 INFO __init__: Creating state cache with size 0
19/11/15 12:11:28 INFO __init__: Creating insecure control channel for localhost:32791.
19/11/15 12:11:28 INFO __init__: Control channel established.
19/11/15 12:11:28 INFO __init__: Initializing SDKHarness with 12 workers.
19/11/15 12:11:28 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/15 12:11:28 INFO create_state_handler: Creating insecure state channel for localhost:44557.
19/11/15 12:11:28 INFO create_state_handler: State channel established.
19/11/15 12:11:28 INFO create_data_channel: Creating client data channel for localhost:39925
19/11/15 12:11:28 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 12:11:28 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 12:11:28 INFO run: No more requests from control plane
19/11/15 12:11:28 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 12:11:28 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 12:11:28 INFO close: Closing all cached grpc data channels.
19/11/15 12:11:28 INFO close: Closing all cached gRPC state handlers.
19/11/15 12:11:28 INFO run: Done consuming work.
19/11/15 12:11:28 INFO main: Python sdk harness exiting.
19/11/15 12:11:28 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 12:11:28 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 12:11:28 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 11970 bytes result sent to driver
19/11/15 12:11:28 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 947 ms on localhost (executor driver) (1/1)
19/11/15 12:11:28 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/15 12:11:28 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 0.952 s
19/11/15 12:11:28 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 4.147835 s
19/11/15 12:11:28 INFO SparkPipelineRunner: Job test_windowing_1573819883.16_dd968cc0-de6f-4b59-b1d8-3edf70905b23 finished.
19/11/15 12:11:28 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/15 12:11:28 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktest1buxLU/job_f54a29d4-ac86-4096-bb76-eb405ce98293/MANIFEST has 0 artifact locations
19/11/15 12:11:28 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktest1buxLU/job_f54a29d4-ac86-4096-bb76-eb405ce98293/
INFO:root:Job state changed to DONE
.
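
The worker logs above show each test launching SDK harnesses through the portable PROCESS environment. Expressed as pipeline options, the configuration visible in those logs looks roughly like the following sketch; the endpoint and the worker-script path are placeholders to adjust for a local setup:

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:46665',
        '--environment_type=PROCESS',
        '--environment_config={"command": "/path/to/sdk_worker.sh"}',
    ])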
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 229, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 420, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 73, in handler
==================== Timed out after 60 seconds. ====================

    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

# Thread: <Thread(wait_until_finish_read, started daemon 140062325094144)>

# Thread: <Thread(Thread-118, started daemon 140062316701440)>

======================================================================# Thread: <_MainThread(MainThread, started 140063112926976)>
==================== Timed out after 60 seconds. ====================


ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
# Thread: <Thread(wait_until_finish_read, started daemon 140061827917568)>

# Thread: <Thread(Thread-122, started daemon 140061836310272)>

# Thread: <_MainThread(MainThread, started 140063112926976)>

# Thread: <Thread(Thread-118, started daemon 140062316701440)>

# Thread: <Thread(wait_until_finish_read, started daemon 140062325094144)>
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 326, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 420, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 73, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
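
Each timeout traceback bottoms out in the same place: wait_until_finish iterating self._state_stream, a gRPC response stream from the job server, with no deadline. When debugging runs like these, a bounded readiness probe against the job endpoint can help tell a dead job server from a genuinely hung job; this is a diagnostic sketch, with the endpoint taken from the worker logs above and to be adjusted for your setup:

    import grpc

    channel = grpc.insecure_channel('localhost:46665')
    try:
        # Block at most 60 seconds for the channel to become ready.
        grpc.channel_ready_future(channel).result(timeout=60)
    except grpc.FutureTimeoutError:
        print('Job server did not become ready within 60 seconds.')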

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 497, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 430, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1573819874.62_4d9bda4b-b3d4-4e00-84d6-5cfd76188ef7 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
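
Unlike the two timeouts, test_sdf_with_watermark_tracking fails fast: the Spark portable runner reports FAILED because no bundle checkpoint handler is registered for splittable DoFn checkpointing. The "pipeline.py, line 436, in __exit__" frame in all three tracebacks reflects the standard Beam test pattern, sketched minimally here with toy data for illustration:

    import apache_beam as beam
    from apache_beam.testing.util import assert_that, equal_to

    # Leaving the "with" block calls Pipeline.__exit__, which runs
    # self.run().wait_until_finish() -- the call where these tests
    # either hang (the timeouts) or surface the runner's FAILED state.
    with beam.Pipeline() as p:
        actual = p | beam.Create(['a', 'b', 'c'])
        assert_that(actual, equal_to(['a', 'b', 'c']))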

----------------------------------------------------------------------
Ran 38 tests in 298.156s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 47s
59 actionable tasks: 46 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/qcdsixyemlm7e

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1540

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1540/display/redirect?page=changes>

Changes:

[skoonce] [BEAM-8554] Use WorkItemCommitRequest protobuf fields to signal that a


------------------------------------------
[...truncated 1.65 MB...]
19/11/15 07:19:30 INFO DAGScheduler: waiting: Set(ShuffleMapStage 132, ResultStage 133)
19/11/15 07:19:30 INFO DAGScheduler: failed: Set()
19/11/15 07:19:30 INFO DAGScheduler: Submitting ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115), which has no missing parents
19/11/15 07:19:30 INFO MemoryStore: Block broadcast_129 stored as values in memory (estimated size 57.2 KB, free 13.5 GB)
19/11/15 07:19:30 INFO MemoryStore: Block broadcast_129_piece0 stored as bytes in memory (estimated size 22.9 KB, free 13.5 GB)
19/11/15 07:19:30 INFO BlockManagerInfo: Added broadcast_129_piece0 in memory on localhost:43505 (size: 22.9 KB, free: 13.5 GB)
19/11/15 07:19:30 INFO SparkContext: Created broadcast 129 from broadcast at DAGScheduler.scala:1161
19/11/15 07:19:30 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115) (first 15 tasks are for partitions Vector(0, 1))
19/11/15 07:19:30 INFO TaskSchedulerImpl: Adding task set 132.0 with 2 tasks
19/11/15 07:19:30 INFO TaskSetManager: Starting task 1.0 in stage 132.0 (TID 159, localhost, executor driver, partition 1, NODE_LOCAL, 7760 bytes)
19/11/15 07:19:30 INFO Executor: Running task 1.0 in stage 132.0 (TID 159)
19/11/15 07:19:30 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/15 07:19:30 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/15 07:19:30 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest_zW7vs/job_f60a38f8-496f-41bc-b6ad-2b4ee49df307/MANIFEST
19/11/15 07:19:30 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest_zW7vs/job_f60a38f8-496f-41bc-b6ad-2b4ee49df307/MANIFEST -> 0 artifacts
19/11/15 07:19:31 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 07:19:31 INFO main: Logging handler created.
19/11/15 07:19:31 INFO start: Status HTTP server running at localhost:38121
19/11/15 07:19:31 INFO main: semi_persistent_directory: /tmp
19/11/15 07:19:31 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 07:19:31 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573802368.14_c341b911-d3c2-42ac-92fe-1f9053481b81', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 07:19:31 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573802368.14', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:45459'}
19/11/15 07:19:31 INFO __init__: Creating state cache with size 0
19/11/15 07:19:31 INFO __init__: Creating insecure control channel for localhost:37875.
19/11/15 07:19:31 INFO __init__: Control channel established.
19/11/15 07:19:31 INFO __init__: Initializing SDKHarness with 12 workers.
19/11/15 07:19:31 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/15 07:19:31 INFO create_state_handler: Creating insecure state channel for localhost:46389.
19/11/15 07:19:31 INFO create_state_handler: State channel established.
19/11/15 07:19:31 INFO create_data_channel: Creating client data channel for localhost:42223
19/11/15 07:19:31 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 07:19:31 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 07:19:31 INFO run: No more requests from control plane
19/11/15 07:19:31 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 07:19:31 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 07:19:31 INFO close: Closing all cached grpc data channels.
19/11/15 07:19:31 INFO close: Closing all cached gRPC state handlers.
19/11/15 07:19:31 INFO run: Done consuming work.
19/11/15 07:19:31 INFO main: Python sdk harness exiting.
19/11/15 07:19:31 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 07:19:31 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 07:19:31 INFO Executor: Finished task 1.0 in stage 132.0 (TID 159). 15229 bytes result sent to driver
19/11/15 07:19:31 INFO TaskSetManager: Starting task 0.0 in stage 132.0 (TID 160, localhost, executor driver, partition 0, PROCESS_LOCAL, 7977 bytes)
19/11/15 07:19:31 INFO Executor: Running task 0.0 in stage 132.0 (TID 160)
19/11/15 07:19:31 INFO TaskSetManager: Finished task 1.0 in stage 132.0 (TID 159) in 877 ms on localhost (executor driver) (1/2)
19/11/15 07:19:31 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest_zW7vs/job_f60a38f8-496f-41bc-b6ad-2b4ee49df307/MANIFEST
19/11/15 07:19:31 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest_zW7vs/job_f60a38f8-496f-41bc-b6ad-2b4ee49df307/MANIFEST -> 0 artifacts
19/11/15 07:19:32 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 07:19:32 INFO main: Logging handler created.
19/11/15 07:19:32 INFO start: Status HTTP server running at localhost:38825
19/11/15 07:19:32 INFO main: semi_persistent_directory: /tmp
19/11/15 07:19:32 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 07:19:32 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573802368.14_c341b911-d3c2-42ac-92fe-1f9053481b81', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 07:19:32 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573802368.14', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:45459'}
19/11/15 07:19:32 INFO __init__: Creating state cache with size 0
19/11/15 07:19:32 INFO __init__: Creating insecure control channel for localhost:38677.
19/11/15 07:19:32 INFO __init__: Control channel established.
19/11/15 07:19:32 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/15 07:19:32 INFO __init__: Initializing SDKHarness with 12 workers.
19/11/15 07:19:32 INFO create_state_handler: Creating insecure state channel for localhost:40797.
19/11/15 07:19:32 INFO create_state_handler: State channel established.
19/11/15 07:19:32 INFO create_data_channel: Creating client data channel for localhost:32883
19/11/15 07:19:32 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 07:19:32 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 07:19:32 INFO run: No more requests from control plane
19/11/15 07:19:32 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 07:19:32 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 07:19:32 INFO close: Closing all cached grpc data channels.
19/11/15 07:19:32 INFO close: Closing all cached gRPC state handlers.
19/11/15 07:19:32 INFO run: Done consuming work.
19/11/15 07:19:32 INFO main: Python sdk harness exiting.
19/11/15 07:19:32 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 07:19:32 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 07:19:32 INFO Executor: Finished task 0.0 in stage 132.0 (TID 160). 13710 bytes result sent to driver
19/11/15 07:19:32 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 160) in 828 ms on localhost (executor driver) (2/2)
19/11/15 07:19:32 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/15 07:19:32 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 1.709 s
19/11/15 07:19:32 INFO DAGScheduler: looking for newly runnable stages
19/11/15 07:19:32 INFO DAGScheduler: running: Set()
19/11/15 07:19:32 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/15 07:19:32 INFO DAGScheduler: failed: Set()
19/11/15 07:19:32 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/15 07:19:32 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.5 GB)
19/11/15 07:19:32 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.3 KB, free 13.5 GB)
19/11/15 07:19:32 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:43505 (size: 12.3 KB, free: 13.5 GB)
19/11/15 07:19:32 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/15 07:19:32 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/15 07:19:32 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/15 07:19:32 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/15 07:19:32 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/15 07:19:32 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/15 07:19:32 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/15 07:19:32 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest_zW7vs/job_f60a38f8-496f-41bc-b6ad-2b4ee49df307/MANIFEST
19/11/15 07:19:32 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest_zW7vs/job_f60a38f8-496f-41bc-b6ad-2b4ee49df307/MANIFEST -> 0 artifacts
19/11/15 07:19:33 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 07:19:33 INFO main: Logging handler created.
19/11/15 07:19:33 INFO start: Status HTTP server running at localhost:43479
19/11/15 07:19:33 INFO main: semi_persistent_directory: /tmp
19/11/15 07:19:33 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 07:19:33 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573802368.14_c341b911-d3c2-42ac-92fe-1f9053481b81', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 07:19:33 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573802368.14', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:45459'}
19/11/15 07:19:33 INFO __init__: Creating state cache with size 0
19/11/15 07:19:33 INFO __init__: Creating insecure control channel for localhost:37031.
19/11/15 07:19:33 INFO __init__: Control channel established.
19/11/15 07:19:33 INFO __init__: Initializing SDKHarness with 12 workers.
19/11/15 07:19:33 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/15 07:19:33 INFO create_state_handler: Creating insecure state channel for localhost:45975.
19/11/15 07:19:33 INFO create_state_handler: State channel established.
19/11/15 07:19:33 INFO create_data_channel: Creating client data channel for localhost:44983
19/11/15 07:19:33 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 07:19:33 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 07:19:33 INFO run: No more requests from control plane
19/11/15 07:19:33 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 07:19:33 INFO close: Closing all cached grpc data channels.
19/11/15 07:19:33 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 07:19:33 INFO close: Closing all cached gRPC state handlers.
19/11/15 07:19:33 INFO run: Done consuming work.
19/11/15 07:19:33 INFO main: Python sdk harness exiting.
19/11/15 07:19:33 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 07:19:33 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 07:19:33 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 11970 bytes result sent to driver
19/11/15 07:19:33 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 820 ms on localhost (executor driver) (1/1)
19/11/15 07:19:33 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/15 07:19:33 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 0.826 s
19/11/15 07:19:33 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 4.199225 s
19/11/15 07:19:33 INFO SparkPipelineRunner: Job test_windowing_1573802368.14_c341b911-d3c2-42ac-92fe-1f9053481b81 finished.
19/11/15 07:19:33 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/15 07:19:33 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktest_zW7vs/job_f60a38f8-496f-41bc-b6ad-2b4ee49df307/MANIFEST has 0 artifact locations
19/11/15 07:19:33 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktest_zW7vs/job_f60a38f8-496f-41bc-b6ad-2b4ee49df307/
INFO:root:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 229, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 420, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 73, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140326832428800)>
# Thread: <Thread(Thread-119, started daemon 140326840821504)>
# Thread: <_MainThread(MainThread, started 140327972108032)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140326815643392)>
# Thread: <Thread(Thread-125, started daemon 140326824036096)>
# Thread: <Thread(Thread-119, started daemon 140326840821504)>
# Thread: <_MainThread(MainThread, started 140327972108032)>
# Thread: <Thread(wait_until_finish_read, started daemon 140326832428800)>

Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 326, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 420, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 73, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 497, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 430, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1573802359.14_df8dc6ee-79f3-4176-8be2-d123ab6f56e4 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 303.220s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 28s
59 actionable tasks: 46 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/fwix5a5tmaiqm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1539

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1539/display/redirect>

Changes:


------------------------------------------
[...truncated 1.67 MB...]
19/11/15 06:17:39 INFO DAGScheduler: waiting: Set(ShuffleMapStage 132, ResultStage 133)
19/11/15 06:17:39 INFO DAGScheduler: failed: Set()
19/11/15 06:17:39 INFO DAGScheduler: Submitting ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115), which has no missing parents
19/11/15 06:17:39 INFO MemoryStore: Block broadcast_129 stored as values in memory (estimated size 57.2 KB, free 13.4 GB)
19/11/15 06:17:39 INFO MemoryStore: Block broadcast_129_piece0 stored as bytes in memory (estimated size 22.9 KB, free 13.4 GB)
19/11/15 06:17:39 INFO BlockManagerInfo: Added broadcast_129_piece0 in memory on localhost:33227 (size: 22.9 KB, free: 13.4 GB)
19/11/15 06:17:39 INFO SparkContext: Created broadcast 129 from broadcast at DAGScheduler.scala:1161
19/11/15 06:17:39 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115) (first 15 tasks are for partitions Vector(0, 1))
19/11/15 06:17:39 INFO TaskSchedulerImpl: Adding task set 132.0 with 2 tasks
19/11/15 06:17:39 INFO TaskSetManager: Starting task 1.0 in stage 132.0 (TID 159, localhost, executor driver, partition 1, NODE_LOCAL, 7760 bytes)
19/11/15 06:17:39 INFO Executor: Running task 1.0 in stage 132.0 (TID 159)
19/11/15 06:17:39 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/15 06:17:39 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/15 06:17:39 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestkV159U/job_dadbeb51-0c2b-46a3-8442-e2643a056f14/MANIFEST
19/11/15 06:17:39 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestkV159U/job_dadbeb51-0c2b-46a3-8442-e2643a056f14/MANIFEST -> 0 artifacts
19/11/15 06:17:39 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 06:17:39 INFO main: Logging handler created.
19/11/15 06:17:39 INFO start: Status HTTP server running at localhost:43413
19/11/15 06:17:39 INFO main: semi_persistent_directory: /tmp
19/11/15 06:17:39 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 06:17:39 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573798656.25_8cba2af8-cfee-4d95-8a32-b65a1a944173', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 06:17:39 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573798656.25', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46471'}
19/11/15 06:17:39 INFO __init__: Creating state cache with size 0
19/11/15 06:17:39 INFO __init__: Creating insecure control channel for localhost:38757.
19/11/15 06:17:39 INFO __init__: Control channel established.
19/11/15 06:17:39 INFO __init__: Initializing SDKHarness with 12 workers.
19/11/15 06:17:39 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/15 06:17:39 INFO create_state_handler: Creating insecure state channel for localhost:45277.
19/11/15 06:17:39 INFO create_state_handler: State channel established.
19/11/15 06:17:39 INFO create_data_channel: Creating client data channel for localhost:34861
19/11/15 06:17:39 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 06:17:39 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 06:17:39 INFO run: No more requests from control plane
19/11/15 06:17:39 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 06:17:39 INFO close: Closing all cached grpc data channels.
19/11/15 06:17:39 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 06:17:39 INFO close: Closing all cached gRPC state handlers.
19/11/15 06:17:39 INFO run: Done consuming work.
19/11/15 06:17:39 INFO main: Python sdk harness exiting.
19/11/15 06:17:39 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 06:17:39 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 06:17:39 INFO Executor: Finished task 1.0 in stage 132.0 (TID 159). 15229 bytes result sent to driver
19/11/15 06:17:39 INFO TaskSetManager: Starting task 0.0 in stage 132.0 (TID 160, localhost, executor driver, partition 0, PROCESS_LOCAL, 7977 bytes)
19/11/15 06:17:39 INFO Executor: Running task 0.0 in stage 132.0 (TID 160)
19/11/15 06:17:39 INFO TaskSetManager: Finished task 1.0 in stage 132.0 (TID 159) in 899 ms on localhost (executor driver) (1/2)
19/11/15 06:17:39 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestkV159U/job_dadbeb51-0c2b-46a3-8442-e2643a056f14/MANIFEST
19/11/15 06:17:39 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestkV159U/job_dadbeb51-0c2b-46a3-8442-e2643a056f14/MANIFEST -> 0 artifacts
19/11/15 06:17:40 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 06:17:40 INFO main: Logging handler created.
19/11/15 06:17:40 INFO start: Status HTTP server running at localhost:43189
19/11/15 06:17:40 INFO main: semi_persistent_directory: /tmp
19/11/15 06:17:40 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 06:17:40 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573798656.25_8cba2af8-cfee-4d95-8a32-b65a1a944173', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 06:17:40 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573798656.25', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46471'}
19/11/15 06:17:40 INFO __init__: Creating state cache with size 0
19/11/15 06:17:40 INFO __init__: Creating insecure control channel for localhost:45341.
19/11/15 06:17:40 INFO __init__: Control channel established.
19/11/15 06:17:40 INFO __init__: Initializing SDKHarness with 12 workers.
19/11/15 06:17:40 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/15 06:17:40 INFO create_state_handler: Creating insecure state channel for localhost:37203.
19/11/15 06:17:40 INFO create_state_handler: State channel established.
19/11/15 06:17:40 INFO create_data_channel: Creating client data channel for localhost:34811
19/11/15 06:17:40 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 06:17:40 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 06:17:40 INFO run: No more requests from control plane
19/11/15 06:17:40 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 06:17:40 INFO close: Closing all cached grpc data channels.
19/11/15 06:17:40 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 06:17:40 INFO close: Closing all cached gRPC state handlers.
19/11/15 06:17:40 INFO run: Done consuming work.
19/11/15 06:17:40 INFO main: Python sdk harness exiting.
19/11/15 06:17:40 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 06:17:40 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 06:17:40 INFO Executor: Finished task 0.0 in stage 132.0 (TID 160). 13710 bytes result sent to driver
19/11/15 06:17:40 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 160) in 848 ms on localhost (executor driver) (2/2)
19/11/15 06:17:40 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/15 06:17:40 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 1.754 s
19/11/15 06:17:40 INFO DAGScheduler: looking for newly runnable stages
19/11/15 06:17:40 INFO DAGScheduler: running: Set()
19/11/15 06:17:40 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/15 06:17:40 INFO DAGScheduler: failed: Set()
19/11/15 06:17:40 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/15 06:17:40 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.4 GB)
19/11/15 06:17:40 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.3 KB, free 13.4 GB)
19/11/15 06:17:40 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:33227 (size: 12.3 KB, free: 13.4 GB)
19/11/15 06:17:40 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/15 06:17:40 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/15 06:17:40 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/15 06:17:40 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/15 06:17:40 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/15 06:17:40 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/15 06:17:40 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/15 06:17:40 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestkV159U/job_dadbeb51-0c2b-46a3-8442-e2643a056f14/MANIFEST
19/11/15 06:17:40 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestkV159U/job_dadbeb51-0c2b-46a3-8442-e2643a056f14/MANIFEST -> 0 artifacts
19/11/15 06:17:41 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 06:17:41 INFO main: Logging handler created.
19/11/15 06:17:41 INFO start: Status HTTP server running at localhost:37761
19/11/15 06:17:41 INFO main: semi_persistent_directory: /tmp
19/11/15 06:17:41 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 06:17:41 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573798656.25_8cba2af8-cfee-4d95-8a32-b65a1a944173', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 06:17:41 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573798656.25', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46471'}
19/11/15 06:17:41 INFO __init__: Creating state cache with size 0
19/11/15 06:17:41 INFO __init__: Creating insecure control channel for localhost:41829.
19/11/15 06:17:41 INFO __init__: Control channel established.
19/11/15 06:17:41 INFO __init__: Initializing SDKHarness with 12 workers.
19/11/15 06:17:41 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/15 06:17:41 INFO create_state_handler: Creating insecure state channel for localhost:42117.
19/11/15 06:17:41 INFO create_state_handler: State channel established.
19/11/15 06:17:41 INFO create_data_channel: Creating client data channel for localhost:38525
19/11/15 06:17:41 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 06:17:41 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 06:17:41 INFO run: No more requests from control plane
19/11/15 06:17:41 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 06:17:41 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 06:17:41 INFO close: Closing all cached grpc data channels.
19/11/15 06:17:41 INFO close: Closing all cached gRPC state handlers.
19/11/15 06:17:41 INFO run: Done consuming work.
19/11/15 06:17:41 INFO main: Python sdk harness exiting.
19/11/15 06:17:41 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 06:17:41 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 06:17:41 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 11970 bytes result sent to driver
19/11/15 06:17:41 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 858 ms on localhost (executor driver) (1/1)
19/11/15 06:17:41 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/15 06:17:41 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 0.864 s
19/11/15 06:17:41 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 4.379877 s
19/11/15 06:17:41 INFO SparkPipelineRunner: Job test_windowing_1573798656.25_8cba2af8-cfee-4d95-8a32-b65a1a944173 finished.
19/11/15 06:17:41 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/15 06:17:41 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktestkV159U/job_dadbeb51-0c2b-46a3-8442-e2643a056f14/MANIFEST has 0 artifact locations
19/11/15 06:17:41 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestkV159U/job_dadbeb51-0c2b-46a3-8442-e2643a056f14/
INFO:root:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 229, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 420, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 73, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
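
For reference, this BaseException is raised by the watchdog visible as the `handler` frame at portable_runner_test.py line 73 above: when a test exceeds 60 seconds it prints a "==================== Timed out ... ====================" banner plus every live thread, then raises BaseException rather than Exception so nothing in the code under test can swallow it. The "# Thread:" lines scattered through the tracebacks below are that dump racing with unittest's own output. A minimal sketch of the pattern, with hypothetical names and assuming a Unix SIGALRM timer (this is not the actual Beam helper):

    import signal
    import sys
    import threading
    import traceback

    def install_watchdog(seconds=60):
        """Dump all threads and abort the current test after `seconds`."""
        def handler(signum, frame):
            msg = 'Timed out after %d seconds.' % seconds
            print('=' * 20 + ' ' + msg + ' ' + '=' * 20)
            frames = sys._current_frames()  # maps thread id -> current stack frame
            for thread in threading.enumerate():
                print('# Thread: %s' % thread)
                if thread.ident in frames:
                    traceback.print_stack(frames[thread.ident])
            # BaseException escapes ordinary `except Exception:` clauses.
            raise BaseException(msg)
        signal.signal(signal.SIGALRM, handler)  # Unix-only
        signal.alarm(seconds)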

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 326, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 420, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 73, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139893502101248)>
# Thread: <Thread(Thread-119, started daemon 139893510493952)>
# Thread: <_MainThread(MainThread, started 139894503941888)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139893493708544)>
# Thread: <Thread(Thread-125, started daemon 139893485315840)>
# Thread: <_MainThread(MainThread, started 139894503941888)>
# Thread: <Thread(Thread-119, started daemon 139893510493952)>
# Thread: <Thread(wait_until_finish_read, started daemon 139893502101248)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 497, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 430, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1573798646.97_afd409c5-cb17-4564-9391-5e77332f3362 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 308.204s

FAILED (errors=3, skipped=9)
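
The same three errors recur in every run archived below: test_pardo_state_with_custom_key_coder and test_pardo_timers both sit in grpc's _common.wait polling the job-state stream until the 60-second watchdog fires, while test_sdf_with_watermark_tracking fails outright with the UnsupportedOperationException from ActiveBundle, which presumably means the portable Spark runner has no bundle checkpoint handler registered and so a splittable DoFn cannot hand its unprocessed residual back to the runner (that reading is an inference from the message, not from the runner code).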

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
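
To reproduce the failing task locally, something like the following should work from the root of a Beam checkout (the Gradle wrapper path is an assumption; the task name is taken from the failure above):

    ./gradlew :sdks:python:test-suites:portable:py2:sparkValidatesRunner --stacktrace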

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 44s
59 actionable tasks: 46 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/mxt35t2buzcvc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1538

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1538/display/redirect?page=changes>

Changes:

[robertwb] [BEAM-8657] Combiner lifting fix for bundle-based direct runner.


------------------------------------------
[...truncated 1.65 MB...]
19/11/15 05:36:33 INFO DAGScheduler: waiting: Set(ShuffleMapStage 132, ResultStage 133)
19/11/15 05:36:33 INFO DAGScheduler: failed: Set()
19/11/15 05:36:33 INFO DAGScheduler: Submitting ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115), which has no missing parents
19/11/15 05:36:33 INFO MemoryStore: Block broadcast_129 stored as values in memory (estimated size 57.2 KB, free 13.5 GB)
19/11/15 05:36:33 INFO MemoryStore: Block broadcast_129_piece0 stored as bytes in memory (estimated size 22.0 KB, free 13.5 GB)
19/11/15 05:36:33 INFO BlockManagerInfo: Added broadcast_129_piece0 in memory on localhost:33591 (size: 22.0 KB, free: 13.5 GB)
19/11/15 05:36:33 INFO SparkContext: Created broadcast 129 from broadcast at DAGScheduler.scala:1161
19/11/15 05:36:33 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115) (first 15 tasks are for partitions Vector(0, 1))
19/11/15 05:36:33 INFO TaskSchedulerImpl: Adding task set 132.0 with 2 tasks
19/11/15 05:36:33 INFO TaskSetManager: Starting task 0.0 in stage 132.0 (TID 159, localhost, executor driver, partition 0, NODE_LOCAL, 7760 bytes)
19/11/15 05:36:33 INFO Executor: Running task 0.0 in stage 132.0 (TID 159)
19/11/15 05:36:33 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/15 05:36:33 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/15 05:36:33 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestGxYr2b/job_f42e6581-1c68-4d09-9a8c-21ecc53e1cc4/MANIFEST
19/11/15 05:36:33 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestGxYr2b/job_f42e6581-1c68-4d09-9a8c-21ecc53e1cc4/MANIFEST -> 0 artifacts
19/11/15 05:36:34 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 05:36:34 INFO main: Logging handler created.
19/11/15 05:36:34 INFO start: Status HTTP server running at localhost:40235
19/11/15 05:36:34 INFO main: semi_persistent_directory: /tmp
19/11/15 05:36:34 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 05:36:34 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573796191.19_fa9c7858-f7a0-499d-8053-69fd478d2253', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 05:36:34 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573796191.19', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:34903'}
19/11/15 05:36:34 INFO __init__: Creating state cache with size 0
19/11/15 05:36:34 INFO __init__: Creating insecure control channel for localhost:39987.
19/11/15 05:36:34 INFO __init__: Control channel established.
19/11/15 05:36:34 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/15 05:36:34 INFO __init__: Initializing SDKHarness with 12 workers.
19/11/15 05:36:34 INFO create_state_handler: Creating insecure state channel for localhost:46031.
19/11/15 05:36:34 INFO create_state_handler: State channel established.
19/11/15 05:36:34 INFO create_data_channel: Creating client data channel for localhost:34957
19/11/15 05:36:34 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 05:36:34 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 05:36:34 INFO run: No more requests from control plane
19/11/15 05:36:34 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 05:36:34 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 05:36:34 INFO close: Closing all cached grpc data channels.
19/11/15 05:36:34 INFO close: Closing all cached gRPC state handlers.
19/11/15 05:36:34 INFO run: Done consuming work.
19/11/15 05:36:34 INFO main: Python sdk harness exiting.
19/11/15 05:36:34 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 05:36:34 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 05:36:34 INFO Executor: Finished task 0.0 in stage 132.0 (TID 159). 15229 bytes result sent to driver
19/11/15 05:36:34 INFO TaskSetManager: Starting task 1.0 in stage 132.0 (TID 160, localhost, executor driver, partition 1, PROCESS_LOCAL, 7977 bytes)
19/11/15 05:36:34 INFO Executor: Running task 1.0 in stage 132.0 (TID 160)
19/11/15 05:36:34 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 159) in 878 ms on localhost (executor driver) (1/2)
19/11/15 05:36:34 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestGxYr2b/job_f42e6581-1c68-4d09-9a8c-21ecc53e1cc4/MANIFEST
19/11/15 05:36:34 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestGxYr2b/job_f42e6581-1c68-4d09-9a8c-21ecc53e1cc4/MANIFEST -> 0 artifacts
19/11/15 05:36:35 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 05:36:35 INFO main: Logging handler created.
19/11/15 05:36:35 INFO start: Status HTTP server running at localhost:36117
19/11/15 05:36:35 INFO main: semi_persistent_directory: /tmp
19/11/15 05:36:35 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 05:36:35 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573796191.19_fa9c7858-f7a0-499d-8053-69fd478d2253', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 05:36:35 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573796191.19', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:34903'}
19/11/15 05:36:35 INFO __init__: Creating state cache with size 0
19/11/15 05:36:35 INFO __init__: Creating insecure control channel for localhost:45289.
19/11/15 05:36:35 INFO __init__: Control channel established.
19/11/15 05:36:35 INFO __init__: Initializing SDKHarness with 12 workers.
19/11/15 05:36:35 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/15 05:36:35 INFO create_state_handler: Creating insecure state channel for localhost:41801.
19/11/15 05:36:35 INFO create_state_handler: State channel established.
19/11/15 05:36:35 INFO create_data_channel: Creating client data channel for localhost:38205
19/11/15 05:36:35 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 05:36:35 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 05:36:35 INFO run: No more requests from control plane
19/11/15 05:36:35 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 05:36:35 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 05:36:35 INFO close: Closing all cached grpc data channels.
19/11/15 05:36:35 INFO close: Closing all cached gRPC state handlers.
19/11/15 05:36:35 INFO run: Done consuming work.
19/11/15 05:36:35 INFO main: Python sdk harness exiting.
19/11/15 05:36:35 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 05:36:35 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 05:36:35 INFO Executor: Finished task 1.0 in stage 132.0 (TID 160). 13710 bytes result sent to driver
19/11/15 05:36:35 INFO TaskSetManager: Finished task 1.0 in stage 132.0 (TID 160) in 845 ms on localhost (executor driver) (2/2)
19/11/15 05:36:35 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/15 05:36:35 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 1.728 s
19/11/15 05:36:35 INFO DAGScheduler: looking for newly runnable stages
19/11/15 05:36:35 INFO DAGScheduler: running: Set()
19/11/15 05:36:35 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/15 05:36:35 INFO DAGScheduler: failed: Set()
19/11/15 05:36:35 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/15 05:36:35 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.5 GB)
19/11/15 05:36:35 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.3 KB, free 13.5 GB)
19/11/15 05:36:35 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:33591 (size: 12.3 KB, free: 13.5 GB)
19/11/15 05:36:35 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/15 05:36:35 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/15 05:36:35 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/15 05:36:35 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/15 05:36:35 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/15 05:36:35 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/15 05:36:35 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/15 05:36:35 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestGxYr2b/job_f42e6581-1c68-4d09-9a8c-21ecc53e1cc4/MANIFEST
19/11/15 05:36:35 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestGxYr2b/job_f42e6581-1c68-4d09-9a8c-21ecc53e1cc4/MANIFEST -> 0 artifacts
19/11/15 05:36:36 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 05:36:36 INFO main: Logging handler created.
19/11/15 05:36:36 INFO start: Status HTTP server running at localhost:34711
19/11/15 05:36:36 INFO main: semi_persistent_directory: /tmp
19/11/15 05:36:36 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 05:36:36 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573796191.19_fa9c7858-f7a0-499d-8053-69fd478d2253', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 05:36:36 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573796191.19', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:34903'}
19/11/15 05:36:36 INFO __init__: Creating state cache with size 0
19/11/15 05:36:36 INFO __init__: Creating insecure control channel for localhost:43101.
19/11/15 05:36:36 INFO __init__: Control channel established.
19/11/15 05:36:36 INFO __init__: Initializing SDKHarness with 12 workers.
19/11/15 05:36:36 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/15 05:36:36 INFO create_state_handler: Creating insecure state channel for localhost:36499.
19/11/15 05:36:36 INFO create_state_handler: State channel established.
19/11/15 05:36:36 INFO create_data_channel: Creating client data channel for localhost:36833
19/11/15 05:36:36 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 05:36:36 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 05:36:36 INFO run: No more requests from control plane
19/11/15 05:36:36 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 05:36:36 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 05:36:36 INFO close: Closing all cached grpc data channels.
19/11/15 05:36:36 INFO close: Closing all cached gRPC state handlers.
19/11/15 05:36:36 INFO run: Done consuming work.
19/11/15 05:36:36 INFO main: Python sdk harness exiting.
19/11/15 05:36:36 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 05:36:36 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 05:36:36 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 11970 bytes result sent to driver
19/11/15 05:36:36 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 851 ms on localhost (executor driver) (1/1)
19/11/15 05:36:36 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/15 05:36:36 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 0.857 s
19/11/15 05:36:36 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 4.267364 s
19/11/15 05:36:36 INFO SparkPipelineRunner: Job test_windowing_1573796191.19_fa9c7858-f7a0-499d-8053-69fd478d2253 finished.
19/11/15 05:36:36 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/15 05:36:36 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktestGxYr2b/job_f42e6581-1c68-4d09-9a8c-21ecc53e1cc4/MANIFEST has 0 artifact locations
19/11/15 05:36:36 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestGxYr2b/job_f42e6581-1c68-4d09-9a8c-21ecc53e1cc4/
INFO:root:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 229, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 420, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 73, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140034616325888)>
# Thread: <Thread(Thread-117, started daemon 140034607933184)>
# Thread: <_MainThread(MainThread, started 140035403949824)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140034599278336)>
# Thread: <Thread(Thread-123, started daemon 140033986131712)>
# Thread: <Thread(Thread-117, started daemon 140034607933184)>
# Thread: <_MainThread(MainThread, started 140035403949824)>
# Thread: <Thread(wait_until_finish_read, started daemon 140034616325888)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 326, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 420, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 73, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 497, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 430, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1573796180.83_f5ff4803-07a0-44ab-9b9a-9e2be9142cfa failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 320.653s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 12s
59 actionable tasks: 46 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/gwdarl27ircbi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1537

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1537/display/redirect?page=changes>

Changes:

[chamikara] Fixes a failing check.


------------------------------------------
[...truncated 1.66 MB...]
19/11/15 02:29:42 INFO DAGScheduler: waiting: Set(ShuffleMapStage 132, ResultStage 133)
19/11/15 02:29:42 INFO DAGScheduler: failed: Set()
19/11/15 02:29:42 INFO DAGScheduler: Submitting ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115), which has no missing parents
19/11/15 02:29:42 INFO MemoryStore: Block broadcast_129 stored as values in memory (estimated size 57.2 KB, free 13.5 GB)
19/11/15 02:29:42 INFO MemoryStore: Block broadcast_129_piece0 stored as bytes in memory (estimated size 22.0 KB, free 13.5 GB)
19/11/15 02:29:42 INFO BlockManagerInfo: Added broadcast_129_piece0 in memory on localhost:44551 (size: 22.0 KB, free: 13.5 GB)
19/11/15 02:29:42 INFO SparkContext: Created broadcast 129 from broadcast at DAGScheduler.scala:1161
19/11/15 02:29:42 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115) (first 15 tasks are for partitions Vector(0, 1))
19/11/15 02:29:42 INFO TaskSchedulerImpl: Adding task set 132.0 with 2 tasks
19/11/15 02:29:42 INFO TaskSetManager: Starting task 0.0 in stage 132.0 (TID 159, localhost, executor driver, partition 0, NODE_LOCAL, 7760 bytes)
19/11/15 02:29:42 INFO Executor: Running task 0.0 in stage 132.0 (TID 159)
19/11/15 02:29:42 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/15 02:29:42 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/15 02:29:43 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestAfHgXb/job_da38cc36-0070-42cf-ae55-37f71f929f97/MANIFEST
19/11/15 02:29:43 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestAfHgXb/job_da38cc36-0070-42cf-ae55-37f71f929f97/MANIFEST -> 0 artifacts
19/11/15 02:29:43 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 02:29:43 INFO main: Logging handler created.
19/11/15 02:29:43 INFO start: Status HTTP server running at localhost:44747
19/11/15 02:29:43 INFO main: semi_persistent_directory: /tmp
19/11/15 02:29:43 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 02:29:43 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573784980.1_73172fe4-2f01-4aa2-ba28-2cf87cf6d5c9', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 02:29:43 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573784980.1', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53209'}
19/11/15 02:29:43 INFO __init__: Creating state cache with size 0
19/11/15 02:29:43 INFO __init__: Creating insecure control channel for localhost:39741.
19/11/15 02:29:43 INFO __init__: Control channel established.
19/11/15 02:29:43 INFO __init__: Initializing SDKHarness with 12 workers.
19/11/15 02:29:43 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/15 02:29:43 INFO create_state_handler: Creating insecure state channel for localhost:36459.
19/11/15 02:29:43 INFO create_state_handler: State channel established.
19/11/15 02:29:43 INFO create_data_channel: Creating client data channel for localhost:46119
19/11/15 02:29:43 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 02:29:43 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 02:29:43 INFO run: No more requests from control plane
19/11/15 02:29:43 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 02:29:43 INFO close: Closing all cached grpc data channels.
19/11/15 02:29:43 INFO close: Closing all cached gRPC state handlers.
19/11/15 02:29:43 INFO run: Done consuming work.
19/11/15 02:29:43 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 02:29:43 INFO main: Python sdk harness exiting.
19/11/15 02:29:43 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 02:29:43 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 02:29:43 INFO Executor: Finished task 0.0 in stage 132.0 (TID 159). 15229 bytes result sent to driver
19/11/15 02:29:43 INFO TaskSetManager: Starting task 1.0 in stage 132.0 (TID 160, localhost, executor driver, partition 1, PROCESS_LOCAL, 7977 bytes)
19/11/15 02:29:43 INFO Executor: Running task 1.0 in stage 132.0 (TID 160)
19/11/15 02:29:43 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 159) in 908 ms on localhost (executor driver) (1/2)
19/11/15 02:29:43 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestAfHgXb/job_da38cc36-0070-42cf-ae55-37f71f929f97/MANIFEST
19/11/15 02:29:43 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestAfHgXb/job_da38cc36-0070-42cf-ae55-37f71f929f97/MANIFEST -> 0 artifacts
19/11/15 02:29:44 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 02:29:44 INFO main: Logging handler created.
19/11/15 02:29:44 INFO start: Status HTTP server running at localhost:44479
19/11/15 02:29:44 INFO main: semi_persistent_directory: /tmp
19/11/15 02:29:44 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 02:29:44 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573784980.1_73172fe4-2f01-4aa2-ba28-2cf87cf6d5c9', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 02:29:44 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573784980.1', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53209'}
19/11/15 02:29:44 INFO __init__: Creating state cache with size 0
19/11/15 02:29:44 INFO __init__: Creating insecure control channel for localhost:38667.
19/11/15 02:29:44 INFO __init__: Control channel established.
19/11/15 02:29:44 INFO __init__: Initializing SDKHarness with 12 workers.
19/11/15 02:29:44 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/15 02:29:44 INFO create_state_handler: Creating insecure state channel for localhost:43963.
19/11/15 02:29:44 INFO create_state_handler: State channel established.
19/11/15 02:29:44 INFO create_data_channel: Creating client data channel for localhost:35881
19/11/15 02:29:44 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 02:29:44 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 02:29:44 INFO run: No more requests from control plane
19/11/15 02:29:44 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 02:29:44 INFO close: Closing all cached grpc data channels.
19/11/15 02:29:44 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 02:29:44 INFO close: Closing all cached gRPC state handlers.
19/11/15 02:29:44 INFO run: Done consuming work.
19/11/15 02:29:44 INFO main: Python sdk harness exiting.
19/11/15 02:29:44 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 02:29:44 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 02:29:44 INFO Executor: Finished task 1.0 in stage 132.0 (TID 160). 13710 bytes result sent to driver
19/11/15 02:29:44 INFO TaskSetManager: Finished task 1.0 in stage 132.0 (TID 160) in 889 ms on localhost (executor driver) (2/2)
19/11/15 02:29:44 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/15 02:29:44 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 1.802 s
19/11/15 02:29:44 INFO DAGScheduler: looking for newly runnable stages
19/11/15 02:29:44 INFO DAGScheduler: running: Set()
19/11/15 02:29:44 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/15 02:29:44 INFO DAGScheduler: failed: Set()
19/11/15 02:29:44 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/15 02:29:44 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.5 GB)
19/11/15 02:29:44 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.4 KB, free 13.5 GB)
19/11/15 02:29:44 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:44551 (size: 12.4 KB, free: 13.5 GB)
19/11/15 02:29:44 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/15 02:29:44 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/15 02:29:44 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/15 02:29:44 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/15 02:29:44 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/15 02:29:44 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/15 02:29:44 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/15 02:29:44 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestAfHgXb/job_da38cc36-0070-42cf-ae55-37f71f929f97/MANIFEST
19/11/15 02:29:44 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestAfHgXb/job_da38cc36-0070-42cf-ae55-37f71f929f97/MANIFEST -> 0 artifacts
19/11/15 02:29:45 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 02:29:45 INFO main: Logging handler created.
19/11/15 02:29:45 INFO start: Status HTTP server running at localhost:32827
19/11/15 02:29:45 INFO main: semi_persistent_directory: /tmp
19/11/15 02:29:45 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 02:29:45 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573784980.1_73172fe4-2f01-4aa2-ba28-2cf87cf6d5c9', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 02:29:45 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573784980.1', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53209'}
19/11/15 02:29:45 INFO __init__: Creating state cache with size 0
19/11/15 02:29:45 INFO __init__: Creating insecure control channel for localhost:43349.
19/11/15 02:29:45 INFO __init__: Control channel established.
19/11/15 02:29:45 INFO __init__: Initializing SDKHarness with 12 workers.
19/11/15 02:29:45 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/15 02:29:45 INFO create_state_handler: Creating insecure state channel for localhost:41351.
19/11/15 02:29:45 INFO create_state_handler: State channel established.
19/11/15 02:29:45 INFO create_data_channel: Creating client data channel for localhost:44973
19/11/15 02:29:45 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 02:29:45 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 02:29:45 INFO run: No more requests from control plane
19/11/15 02:29:45 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 02:29:45 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 02:29:45 INFO close: Closing all cached grpc data channels.
19/11/15 02:29:45 INFO close: Closing all cached gRPC state handlers.
19/11/15 02:29:45 INFO run: Done consuming work.
19/11/15 02:29:45 INFO main: Python sdk harness exiting.
19/11/15 02:29:45 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 02:29:45 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 02:29:45 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 11970 bytes result sent to driver
19/11/15 02:29:45 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 894 ms on localhost (executor driver) (1/1)
19/11/15 02:29:45 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/15 02:29:45 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 0.901 s
19/11/15 02:29:45 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 4.543963 s
19/11/15 02:29:45 INFO SparkPipelineRunner: Job test_windowing_1573784980.1_73172fe4-2f01-4aa2-ba28-2cf87cf6d5c9 finished.
19/11/15 02:29:45 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/15 02:29:45 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktestAfHgXb/job_da38cc36-0070-42cf-ae55-37f71f929f97/MANIFEST has 0 artifact locations
19/11/15 02:29:45 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestAfHgXb/job_da38cc36-0070-42cf-ae55-37f71f929f97/
INFO:root:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 229, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 420, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 73, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 326, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 420, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 73, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140211958839040)>
# Thread: <Thread(Thread-119, started daemon 140212709332736)>
# Thread: <_MainThread(MainThread, started 140213224724224)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140211940742912)>
# Thread: <Thread(Thread-125, started daemon 140211949397760)>
# Thread: <Thread(Thread-119, started daemon 140212709332736)>
# Thread: <_MainThread(MainThread, started 140213224724224)>
# Thread: <Thread(wait_until_finish_read, started daemon 140211958839040)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 497, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 430, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1573784970.59_17b8b622-95d1-4dfa-9c58-1c136d7d614a failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 304.069s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 50s
59 actionable tasks: 46 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/u6tb3uayslipq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1536

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1536/display/redirect?page=changes>

Changes:

[robertwb] [BEAM-8657] Avoid lifting combiners for incompatible triggers.

[robertwb] [BEAM-8663] Respect PaneInfo in bundle based direct runner.

[robertwb] Comment on compatibility.

[robertwb] Cleanup: move direct runner test to correct location.


------------------------------------------
[...truncated 1.67 MB...]
19/11/15 02:06:32 INFO DAGScheduler: waiting: Set(ShuffleMapStage 132, ResultStage 133)
19/11/15 02:06:32 INFO DAGScheduler: failed: Set()
19/11/15 02:06:32 INFO DAGScheduler: Submitting ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115), which has no missing parents
19/11/15 02:06:32 INFO MemoryStore: Block broadcast_129 stored as values in memory (estimated size 57.2 KB, free 13.5 GB)
19/11/15 02:06:32 INFO MemoryStore: Block broadcast_129_piece0 stored as bytes in memory (estimated size 22.9 KB, free 13.5 GB)
19/11/15 02:06:32 INFO BlockManagerInfo: Added broadcast_129_piece0 in memory on localhost:40479 (size: 22.9 KB, free: 13.5 GB)
19/11/15 02:06:32 INFO SparkContext: Created broadcast 129 from broadcast at DAGScheduler.scala:1161
19/11/15 02:06:32 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115) (first 15 tasks are for partitions Vector(0, 1))
19/11/15 02:06:32 INFO TaskSchedulerImpl: Adding task set 132.0 with 2 tasks
19/11/15 02:06:32 INFO TaskSetManager: Starting task 1.0 in stage 132.0 (TID 159, localhost, executor driver, partition 1, NODE_LOCAL, 7760 bytes)
19/11/15 02:06:32 INFO Executor: Running task 1.0 in stage 132.0 (TID 159)
19/11/15 02:06:32 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/15 02:06:32 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/15 02:06:32 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestVspB5E/job_34d37c64-bda2-4ce0-bd12-dfe9a3d890ad/MANIFEST
19/11/15 02:06:32 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestVspB5E/job_34d37c64-bda2-4ce0-bd12-dfe9a3d890ad/MANIFEST -> 0 artifacts
19/11/15 02:06:33 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 02:06:33 INFO main: Logging handler created.
19/11/15 02:06:33 INFO start: Status HTTP server running at localhost:40967
19/11/15 02:06:33 INFO main: semi_persistent_directory: /tmp
19/11/15 02:06:33 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 02:06:33 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573783588.98_d3932084-4083-4635-b646-ca77471bfbe8', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 02:06:33 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573783588.98', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56233'}
19/11/15 02:06:33 INFO __init__: Creating state cache with size 0
19/11/15 02:06:33 INFO __init__: Creating insecure control channel for localhost:33253.
19/11/15 02:06:33 INFO __init__: Control channel established.
19/11/15 02:06:33 INFO __init__: Initializing SDKHarness with 12 workers.
19/11/15 02:06:33 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/15 02:06:33 INFO create_state_handler: Creating insecure state channel for localhost:43167.
19/11/15 02:06:33 INFO create_state_handler: State channel established.
19/11/15 02:06:33 INFO create_data_channel: Creating client data channel for localhost:38497
19/11/15 02:06:33 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 02:06:33 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 02:06:33 INFO run: No more requests from control plane
19/11/15 02:06:33 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 02:06:33 INFO close: Closing all cached grpc data channels.
19/11/15 02:06:33 INFO close: Closing all cached gRPC state handlers.
19/11/15 02:06:33 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 02:06:33 INFO run: Done consuming work.
19/11/15 02:06:33 INFO main: Python sdk harness exiting.
19/11/15 02:06:33 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 02:06:33 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 02:06:33 INFO Executor: Finished task 1.0 in stage 132.0 (TID 159). 15229 bytes result sent to driver
19/11/15 02:06:33 INFO TaskSetManager: Starting task 0.0 in stage 132.0 (TID 160, localhost, executor driver, partition 0, PROCESS_LOCAL, 7977 bytes)
19/11/15 02:06:33 INFO Executor: Running task 0.0 in stage 132.0 (TID 160)
19/11/15 02:06:33 INFO TaskSetManager: Finished task 1.0 in stage 132.0 (TID 159) in 1426 ms on localhost (executor driver) (1/2)
19/11/15 02:06:33 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestVspB5E/job_34d37c64-bda2-4ce0-bd12-dfe9a3d890ad/MANIFEST
19/11/15 02:06:33 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestVspB5E/job_34d37c64-bda2-4ce0-bd12-dfe9a3d890ad/MANIFEST -> 0 artifacts
19/11/15 02:06:34 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 02:06:34 INFO main: Logging handler created.
19/11/15 02:06:34 INFO start: Status HTTP server running at localhost:33869
19/11/15 02:06:34 INFO main: semi_persistent_directory: /tmp
19/11/15 02:06:34 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 02:06:34 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573783588.98_d3932084-4083-4635-b646-ca77471bfbe8', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 02:06:34 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573783588.98', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56233'}
19/11/15 02:06:34 INFO __init__: Creating state cache with size 0
19/11/15 02:06:34 INFO __init__: Creating insecure control channel for localhost:42363.
19/11/15 02:06:34 INFO __init__: Control channel established.
19/11/15 02:06:34 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/15 02:06:34 INFO __init__: Initializing SDKHarness with 12 workers.
19/11/15 02:06:34 INFO create_state_handler: Creating insecure state channel for localhost:43859.
19/11/15 02:06:34 INFO create_state_handler: State channel established.
19/11/15 02:06:34 INFO create_data_channel: Creating client data channel for localhost:42137
19/11/15 02:06:34 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 02:06:34 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 02:06:34 INFO run: No more requests from control plane
19/11/15 02:06:34 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 02:06:34 INFO close: Closing all cached grpc data channels.
19/11/15 02:06:34 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 02:06:34 INFO close: Closing all cached gRPC state handlers.
19/11/15 02:06:34 INFO run: Done consuming work.
19/11/15 02:06:34 INFO main: Python sdk harness exiting.
19/11/15 02:06:34 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 02:06:34 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 02:06:34 INFO Executor: Finished task 0.0 in stage 132.0 (TID 160). 13710 bytes result sent to driver
19/11/15 02:06:34 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 160) in 871 ms on localhost (executor driver) (2/2)
19/11/15 02:06:34 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/15 02:06:34 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 2.303 s
19/11/15 02:06:34 INFO DAGScheduler: looking for newly runnable stages
19/11/15 02:06:34 INFO DAGScheduler: running: Set()
19/11/15 02:06:34 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/15 02:06:34 INFO DAGScheduler: failed: Set()
19/11/15 02:06:34 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/15 02:06:34 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.5 GB)
19/11/15 02:06:34 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.3 KB, free 13.5 GB)
19/11/15 02:06:34 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:40479 (size: 12.3 KB, free: 13.5 GB)
19/11/15 02:06:34 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/15 02:06:34 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/15 02:06:34 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/15 02:06:34 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/15 02:06:34 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/15 02:06:34 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/15 02:06:34 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/15 02:06:34 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestVspB5E/job_34d37c64-bda2-4ce0-bd12-dfe9a3d890ad/MANIFEST
19/11/15 02:06:34 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestVspB5E/job_34d37c64-bda2-4ce0-bd12-dfe9a3d890ad/MANIFEST -> 0 artifacts
19/11/15 02:06:35 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 02:06:35 INFO main: Logging handler created.
19/11/15 02:06:35 INFO start: Status HTTP server running at localhost:39519
19/11/15 02:06:35 INFO main: semi_persistent_directory: /tmp
19/11/15 02:06:35 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 02:06:35 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573783588.98_d3932084-4083-4635-b646-ca77471bfbe8', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 02:06:35 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573783588.98', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56233'}
19/11/15 02:06:35 INFO __init__: Creating state cache with size 0
19/11/15 02:06:35 INFO __init__: Creating insecure control channel for localhost:35953.
19/11/15 02:06:35 INFO __init__: Control channel established.
19/11/15 02:06:35 INFO __init__: Initializing SDKHarness with 12 workers.
19/11/15 02:06:35 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/15 02:06:35 INFO create_state_handler: Creating insecure state channel for localhost:36169.
19/11/15 02:06:35 INFO create_state_handler: State channel established.
19/11/15 02:06:35 INFO create_data_channel: Creating client data channel for localhost:32965
19/11/15 02:06:35 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 02:06:35 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 02:06:35 INFO run: No more requests from control plane
19/11/15 02:06:35 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 02:06:35 INFO close: Closing all cached grpc data channels.
19/11/15 02:06:35 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 02:06:35 INFO close: Closing all cached gRPC state handlers.
19/11/15 02:06:35 INFO run: Done consuming work.
19/11/15 02:06:35 INFO main: Python sdk harness exiting.
19/11/15 02:06:35 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 02:06:35 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 02:06:35 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 11970 bytes result sent to driver
19/11/15 02:06:35 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 875 ms on localhost (executor driver) (1/1)
19/11/15 02:06:35 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/15 02:06:35 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 0.881 s
19/11/15 02:06:35 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 5.456989 s
19/11/15 02:06:35 INFO SparkPipelineRunner: Job test_windowing_1573783588.98_d3932084-4083-4635-b646-ca77471bfbe8 finished.
19/11/15 02:06:35 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/15 02:06:35 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktestVspB5E/job_34d37c64-bda2-4ce0-bd12-dfe9a3d890ad/MANIFEST has 0 artifact locations
19/11/15 02:06:35 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestVspB5E/job_34d37c64-bda2-4ce0-bd12-dfe9a3d890ad/
INFO:root:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 229, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 420, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 73, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
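
The BaseException above comes from the test suite's own watchdog, not from the pipeline under test: portable_runner_test.py (line 73 in the frames above) installs a 60-second alarm whose handler dumps every live thread -- the "# Thread: <...>" lines that appear around the tracebacks below -- and then raises. A minimal sketch of that pattern, assuming a SIGALRM-based watchdog; the names print_stacks_and_raise and run_with_timeout are illustrative, not the actual helpers in the Beam test module:

    import signal
    import sys
    import threading
    import traceback

    TIMEOUT_SECS = 60  # assumed: matches the "Timed out after 60 seconds" message

    def print_stacks_and_raise(signum, frame):
        # Dump every live thread so a hung gRPC read is visible in the log.
        msg = 'Timed out after %d seconds.' % TIMEOUT_SECS
        print('==================== %s ====================' % msg)
        frames = sys._current_frames()
        for thread in threading.enumerate():
            print('# Thread: %s' % thread)
            traceback.print_stack(frames.get(thread.ident))
        # BaseException, not Exception, so broad "except Exception" clauses
        # in the code under test cannot swallow the timeout.
        raise BaseException(msg)

    def run_with_timeout(test_fn):
        signal.signal(signal.SIGALRM, print_stacks_and_raise)
        signal.alarm(TIMEOUT_SECS)  # arm the watchdog
        try:
            return test_fn()
        finally:
            signal.alarm(0)  # disarm on normal completion

Because the alarm fires asynchronously while unittest is still printing, the handler's output interleaves with whatever traceback was being written at that moment; the thread dumps in the failures below have been untangled from the tracebacks they landed in.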

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 326, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 420, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 73, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139636634547968)>
# Thread: <Thread(Thread-118, started daemon 139637389117184)>
# Thread: <_MainThread(MainThread, started 139637904508672)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139636626155264)>
# Thread: <Thread(Thread-124, started daemon 139636617762560)>
# Thread: <Thread(Thread-118, started daemon 139637389117184)>
# Thread: <_MainThread(MainThread, started 139637904508672)>
# Thread: <Thread(wait_until_finish_read, started daemon 139636634547968)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 497, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 430, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1573783579.6_18ec4f0d-e509-45d1-9585-c58e7ad72a42 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
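
Unlike the two timeouts above, this failure is surfaced by wait_until_finish itself once the job service reports a terminal FAILED state. Below is a sketch of the control flow implied by the traceback, assuming a blocking stream of (state, error_message) pairs; the JobHandle class and its field names are illustrative, not the SDK's actual portable_runner.py code:

    # Terminal and non-terminal job states, reduced to strings for the sketch.
    RUNNING, DONE, FAILED = 'RUNNING', 'DONE', 'FAILED'

    class JobHandle(object):
        """Illustrative stand-in for the runner's job handle."""

        def __init__(self, job_id, state_stream):
            self._job_id = job_id
            # Blocking iterator of (state, error_message) pairs, standing in
            # for the gRPC state stream seen in the tracebacks above.
            self._state_stream = state_stream
            self._state = RUNNING
            self._last_error = None

        def _last_error_message(self):
            return self._last_error or '<unknown>'

        def wait_until_finish(self):
            # Each step blocks on the stream with no timeout of its own; if
            # the job server stalls, the caller hangs here until the test
            # watchdog fires -- which is exactly the timeout failures above.
            for state, error in self._state_stream:
                self._state = state
                if error:
                    self._last_error = error
                if state == DONE:
                    return
                if state == FAILED:
                    raise RuntimeError(
                        'Pipeline %s failed in state %s: %s' % (
                            self._job_id, self._state,
                            self._last_error_message()))

For example, JobHandle('job-1', iter([(RUNNING, None), (FAILED, 'boom')])).wait_until_finish() raises the same shape of RuntimeError reported above. The underlying UnsupportedOperationException ("The ActiveBundle does not have a registered bundle checkpoint handler") suggests that splittable-DoFn checkpointing is not yet wired up in the Spark portable runner, so test_sdf_with_watermark_tracking fails rather than timing out.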

----------------------------------------------------------------------
Ran 38 tests in 321.153s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 49s
59 actionable tasks: 46 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/lljhefw5lazlw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1535

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1535/display/redirect?page=changes>

Changes:

[pabloem] [BEAM-8661] Moving io module to have per-module logger

[pabloem] Fix lint

[kcweaver] [BEAM-8660] Override returned artifact staging endpoint

[robertwb] [BEAM-8667] Bound the number of element bundles buffered off the data

[pabloem] [BEAM-8661] Moving other modules to have per-module loggers

[pabloem] Fix lint

[pabloem] Removing extra line between constants

[pabloem] Removing extra space between constants


------------------------------------------
[...truncated 1.66 MB...]
19/11/15 00:43:45 INFO DAGScheduler: waiting: Set(ShuffleMapStage 132, ResultStage 133)
19/11/15 00:43:45 INFO DAGScheduler: failed: Set()
19/11/15 00:43:45 INFO DAGScheduler: Submitting ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115), which has no missing parents
19/11/15 00:43:45 INFO MemoryStore: Block broadcast_129 stored as values in memory (estimated size 57.2 KB, free 13.5 GB)
19/11/15 00:43:45 INFO MemoryStore: Block broadcast_129_piece0 stored as bytes in memory (estimated size 22.9 KB, free 13.5 GB)
19/11/15 00:43:45 INFO BlockManagerInfo: Added broadcast_129_piece0 in memory on localhost:41699 (size: 22.9 KB, free: 13.5 GB)
19/11/15 00:43:45 INFO SparkContext: Created broadcast 129 from broadcast at DAGScheduler.scala:1161
19/11/15 00:43:45 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115) (first 15 tasks are for partitions Vector(0, 1))
19/11/15 00:43:45 INFO TaskSchedulerImpl: Adding task set 132.0 with 2 tasks
19/11/15 00:43:45 INFO TaskSetManager: Starting task 1.0 in stage 132.0 (TID 159, localhost, executor driver, partition 1, NODE_LOCAL, 7760 bytes)
19/11/15 00:43:45 INFO Executor: Running task 1.0 in stage 132.0 (TID 159)
19/11/15 00:43:45 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/15 00:43:45 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/15 00:43:45 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestX1oNsa/job_7d7c144c-d6fc-4db7-84a6-aae5ecc048c7/MANIFEST
19/11/15 00:43:45 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestX1oNsa/job_7d7c144c-d6fc-4db7-84a6-aae5ecc048c7/MANIFEST -> 0 artifacts
19/11/15 00:43:46 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 00:43:46 INFO main: Logging handler created.
19/11/15 00:43:46 INFO start: Status HTTP server running at localhost:35273
19/11/15 00:43:46 INFO main: semi_persistent_directory: /tmp
19/11/15 00:43:46 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 00:43:46 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573778622.83_214eb8b6-263f-4a22-a3be-150dcc10cfdc', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 00:43:46 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573778622.83', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37411'}
19/11/15 00:43:46 INFO __init__: Creating state cache with size 0
19/11/15 00:43:46 INFO __init__: Creating insecure control channel for localhost:44157.
19/11/15 00:43:46 INFO __init__: Control channel established.
19/11/15 00:43:46 INFO __init__: Initializing SDKHarness with 12 workers.
19/11/15 00:43:46 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/15 00:43:46 INFO create_state_handler: Creating insecure state channel for localhost:34607.
19/11/15 00:43:46 INFO create_state_handler: State channel established.
19/11/15 00:43:46 INFO create_data_channel: Creating client data channel for localhost:44725
19/11/15 00:43:46 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 00:43:46 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 00:43:46 INFO run: No more requests from control plane
19/11/15 00:43:46 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 00:43:46 INFO close: Closing all cached grpc data channels.
19/11/15 00:43:46 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 00:43:46 INFO close: Closing all cached gRPC state handlers.
19/11/15 00:43:46 INFO run: Done consuming work.
19/11/15 00:43:46 INFO main: Python sdk harness exiting.
19/11/15 00:43:46 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 00:43:46 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 00:43:46 INFO Executor: Finished task 1.0 in stage 132.0 (TID 159). 15229 bytes result sent to driver
19/11/15 00:43:46 INFO TaskSetManager: Starting task 0.0 in stage 132.0 (TID 160, localhost, executor driver, partition 0, PROCESS_LOCAL, 7977 bytes)
19/11/15 00:43:46 INFO Executor: Running task 0.0 in stage 132.0 (TID 160)
19/11/15 00:43:46 INFO TaskSetManager: Finished task 1.0 in stage 132.0 (TID 159) in 930 ms on localhost (executor driver) (1/2)
19/11/15 00:43:46 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestX1oNsa/job_7d7c144c-d6fc-4db7-84a6-aae5ecc048c7/MANIFEST
19/11/15 00:43:46 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestX1oNsa/job_7d7c144c-d6fc-4db7-84a6-aae5ecc048c7/MANIFEST -> 0 artifacts
19/11/15 00:43:47 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 00:43:47 INFO main: Logging handler created.
19/11/15 00:43:47 INFO start: Status HTTP server running at localhost:43225
19/11/15 00:43:47 INFO main: semi_persistent_directory: /tmp
19/11/15 00:43:47 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 00:43:47 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573778622.83_214eb8b6-263f-4a22-a3be-150dcc10cfdc', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 00:43:47 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573778622.83', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37411'}
19/11/15 00:43:47 INFO __init__: Creating state cache with size 0
19/11/15 00:43:47 INFO __init__: Creating insecure control channel for localhost:36317.
19/11/15 00:43:47 INFO __init__: Control channel established.
19/11/15 00:43:47 INFO __init__: Initializing SDKHarness with 12 workers.
19/11/15 00:43:47 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/15 00:43:47 INFO create_state_handler: Creating insecure state channel for localhost:42869.
19/11/15 00:43:47 INFO create_state_handler: State channel established.
19/11/15 00:43:47 INFO create_data_channel: Creating client data channel for localhost:44051
19/11/15 00:43:47 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 00:43:47 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 00:43:47 INFO run: No more requests from control plane
19/11/15 00:43:47 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 00:43:47 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 00:43:47 INFO close: Closing all cached grpc data channels.
19/11/15 00:43:47 INFO close: Closing all cached gRPC state handlers.
19/11/15 00:43:47 INFO run: Done consuming work.
19/11/15 00:43:47 INFO main: Python sdk harness exiting.
19/11/15 00:43:47 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 00:43:47 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 00:43:47 INFO Executor: Finished task 0.0 in stage 132.0 (TID 160). 13710 bytes result sent to driver
19/11/15 00:43:47 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 160) in 820 ms on localhost (executor driver) (2/2)
19/11/15 00:43:47 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/15 00:43:47 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 1.757 s
19/11/15 00:43:47 INFO DAGScheduler: looking for newly runnable stages
19/11/15 00:43:47 INFO DAGScheduler: running: Set()
19/11/15 00:43:47 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/15 00:43:47 INFO DAGScheduler: failed: Set()
19/11/15 00:43:47 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/15 00:43:47 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.5 GB)
19/11/15 00:43:47 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.3 KB, free 13.5 GB)
19/11/15 00:43:47 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:41699 (size: 12.3 KB, free: 13.5 GB)
19/11/15 00:43:47 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/15 00:43:47 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/15 00:43:47 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/15 00:43:47 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/15 00:43:47 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/15 00:43:47 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/15 00:43:47 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/15 00:43:47 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestX1oNsa/job_7d7c144c-d6fc-4db7-84a6-aae5ecc048c7/MANIFEST
19/11/15 00:43:47 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestX1oNsa/job_7d7c144c-d6fc-4db7-84a6-aae5ecc048c7/MANIFEST -> 0 artifacts
19/11/15 00:43:48 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/15 00:43:48 INFO main: Logging handler created.
19/11/15 00:43:48 INFO start: Status HTTP server running at localhost:46347
19/11/15 00:43:48 INFO main: semi_persistent_directory: /tmp
19/11/15 00:43:48 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/15 00:43:48 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573778622.83_214eb8b6-263f-4a22-a3be-150dcc10cfdc', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/15 00:43:48 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573778622.83', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37411'}
19/11/15 00:43:48 INFO __init__: Creating state cache with size 0
19/11/15 00:43:48 INFO __init__: Creating insecure control channel for localhost:34415.
19/11/15 00:43:48 INFO __init__: Control channel established.
19/11/15 00:43:48 INFO __init__: Initializing SDKHarness with 12 workers.
19/11/15 00:43:48 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/15 00:43:48 INFO create_state_handler: Creating insecure state channel for localhost:42439.
19/11/15 00:43:48 INFO create_state_handler: State channel established.
19/11/15 00:43:48 INFO create_data_channel: Creating client data channel for localhost:40571
19/11/15 00:43:48 INFO GrpcDataService: Beam Fn Data client connected.
19/11/15 00:43:48 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/15 00:43:48 INFO run: No more requests from control plane
19/11/15 00:43:48 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/15 00:43:48 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 00:43:48 INFO close: Closing all cached grpc data channels.
19/11/15 00:43:48 INFO close: Closing all cached gRPC state handlers.
19/11/15 00:43:48 INFO run: Done consuming work.
19/11/15 00:43:48 INFO main: Python sdk harness exiting.
19/11/15 00:43:48 INFO GrpcLoggingService: Logging client hanged up.
19/11/15 00:43:48 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/15 00:43:48 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 11970 bytes result sent to driver
19/11/15 00:43:48 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 842 ms on localhost (executor driver) (1/1)
19/11/15 00:43:48 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/15 00:43:48 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 0.848 s
19/11/15 00:43:48 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 4.328857 s
19/11/15 00:43:48 INFO SparkPipelineRunner: Job test_windowing_1573778622.83_214eb8b6-263f-4a22-a3be-150dcc10cfdc finished.
19/11/15 00:43:48 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/15 00:43:48 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktestX1oNsa/job_7d7c144c-d6fc-4db7-84a6-aae5ecc048c7/MANIFEST has 0 artifact locations
19/11/15 00:43:48 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestX1oNsa/job_7d7c144c-d6fc-4db7-84a6-aae5ecc048c7/
INFO:root:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 229, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 420, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 73, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140382052890368)>
# Thread: <Thread(Thread-118, started daemon 140382061283072)>
# Thread: <_MainThread(MainThread, started 140382840514304)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140382026925824)>
# Thread: <Thread(Thread-122, started daemon 140382035318528)>
# Thread: <Thread(Thread-118, started daemon 140382061283072)>
# Thread: <Thread(wait_until_finish_read, started daemon 140382052890368)>
# Thread: <_MainThread(MainThread, started 140382840514304)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 326, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 420, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 73, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 497, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 430, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1573778613.48_c4be31a8-3ba6-44e2-847e-167d6cdc5b78 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 311.942s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 4s
59 actionable tasks: 46 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/3sqzftwjjqvxa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1534

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1534/display/redirect?page=changes>

Changes:

[wenjialiu] [BEAM-8575] Fix window assignment idempotency tests non-deterministic

[lvwanqi11] [BEAM-8666] Remove dependency between DataflowRunner and PortableRunner


------------------------------------------
[...truncated 1.65 MB...]
19/11/14 23:21:55 INFO main: Logging handler created.
19/11/14 23:21:55 INFO start: Status HTTP server running at localhost:35377
19/11/14 23:21:55 INFO main: semi_persistent_directory: /tmp
19/11/14 23:21:55 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/14 23:21:55 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573773713.3_147b132a-2896-4edb-8ed9-6a90672cb230', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/14 23:21:55 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573773713.3', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40535'}
19/11/14 23:21:55 INFO __init__: Creating state cache with size 0
19/11/14 23:21:55 INFO __init__: Creating insecure control channel for localhost:36593.
19/11/14 23:21:55 INFO __init__: Control channel established.
19/11/14 23:21:55 INFO __init__: Initializing SDKHarness with 12 workers.
19/11/14 23:21:55 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/14 23:21:55 INFO create_state_handler: Creating insecure state channel for localhost:35155.
19/11/14 23:21:55 INFO create_state_handler: State channel established.
19/11/14 23:21:55 INFO create_data_channel: Creating client data channel for localhost:42105
19/11/14 23:21:55 INFO GrpcDataService: Beam Fn Data client connected.
19/11/14 23:21:55 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/14 23:21:55 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 1 ms
19/11/14 23:21:55 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/14 23:21:55 INFO run: No more requests from control plane
19/11/14 23:21:55 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/14 23:21:55 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/14 23:21:55 INFO close: Closing all cached grpc data channels.
19/11/14 23:21:55 INFO close: Closing all cached gRPC state handlers.
19/11/14 23:21:55 INFO run: Done consuming work.
19/11/14 23:21:55 INFO main: Python sdk harness exiting.
19/11/14 23:21:55 INFO GrpcLoggingService: Logging client hanged up.
19/11/14 23:21:55 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/14 23:21:55 INFO Executor: Finished task 0.0 in stage 131.0 (TID 158). 12763 bytes result sent to driver
19/11/14 23:21:55 INFO TaskSetManager: Finished task 0.0 in stage 131.0 (TID 158) in 847 ms on localhost (executor driver) (1/1)
19/11/14 23:21:55 INFO TaskSchedulerImpl: Removed TaskSet 131.0, whose tasks have all completed, from pool 
19/11/14 23:21:55 INFO DAGScheduler: ShuffleMapStage 131 (mapToPair at GroupCombineFunctions.java:55) finished in 0.852 s
19/11/14 23:21:55 INFO DAGScheduler: looking for newly runnable stages
19/11/14 23:21:55 INFO DAGScheduler: running: Set()
19/11/14 23:21:55 INFO DAGScheduler: waiting: Set(ShuffleMapStage 132, ResultStage 133)
19/11/14 23:21:55 INFO DAGScheduler: failed: Set()
19/11/14 23:21:55 INFO DAGScheduler: Submitting ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115), which has no missing parents
19/11/14 23:21:55 INFO MemoryStore: Block broadcast_129 stored as values in memory (estimated size 57.2 KB, free 13.5 GB)
19/11/14 23:21:55 INFO MemoryStore: Block broadcast_129_piece0 stored as bytes in memory (estimated size 22.9 KB, free 13.5 GB)
19/11/14 23:21:55 INFO BlockManagerInfo: Added broadcast_129_piece0 in memory on localhost:46367 (size: 22.9 KB, free: 13.5 GB)
19/11/14 23:21:55 INFO SparkContext: Created broadcast 129 from broadcast at DAGScheduler.scala:1161
19/11/14 23:21:55 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 132 (MapPartitionsRDD[911] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115) (first 15 tasks are for partitions Vector(0, 1))
19/11/14 23:21:55 INFO TaskSchedulerImpl: Adding task set 132.0 with 2 tasks
19/11/14 23:21:55 INFO TaskSetManager: Starting task 1.0 in stage 132.0 (TID 159, localhost, executor driver, partition 1, NODE_LOCAL, 7760 bytes)
19/11/14 23:21:55 INFO Executor: Running task 1.0 in stage 132.0 (TID 159)
19/11/14 23:21:55 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/11/14 23:21:55 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 1 ms
19/11/14 23:21:55 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest6rg9nH/job_f1b336fb-5203-4677-8438-fc77f1e3abd7/MANIFEST
19/11/14 23:21:55 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest6rg9nH/job_f1b336fb-5203-4677-8438-fc77f1e3abd7/MANIFEST -> 0 artifacts
19/11/14 23:21:56 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/14 23:21:56 INFO main: Logging handler created.
19/11/14 23:21:56 INFO start: Status HTTP server running at localhost:45757
19/11/14 23:21:56 INFO main: semi_persistent_directory: /tmp
19/11/14 23:21:56 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/14 23:21:56 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573773713.3_147b132a-2896-4edb-8ed9-6a90672cb230', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/14 23:21:56 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573773713.3', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40535'}
19/11/14 23:21:56 INFO __init__: Creating state cache with size 0
19/11/14 23:21:56 INFO __init__: Creating insecure control channel for localhost:45279.
19/11/14 23:21:56 INFO __init__: Control channel established.
19/11/14 23:21:56 INFO __init__: Initializing SDKHarness with 12 workers.
19/11/14 23:21:56 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/14 23:21:56 INFO create_state_handler: Creating insecure state channel for localhost:34431.
19/11/14 23:21:56 INFO create_state_handler: State channel established.
19/11/14 23:21:56 INFO create_data_channel: Creating client data channel for localhost:43673
19/11/14 23:21:56 INFO GrpcDataService: Beam Fn Data client connected.
19/11/14 23:21:56 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/14 23:21:56 INFO run: No more requests from control plane
19/11/14 23:21:56 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/14 23:21:56 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/14 23:21:56 INFO close: Closing all cached grpc data channels.
19/11/14 23:21:56 INFO close: Closing all cached gRPC state handlers.
19/11/14 23:21:56 INFO run: Done consuming work.
19/11/14 23:21:56 INFO main: Python sdk harness exiting.
19/11/14 23:21:56 INFO GrpcLoggingService: Logging client hanged up.
19/11/14 23:21:56 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/14 23:21:56 INFO Executor: Finished task 1.0 in stage 132.0 (TID 159). 15229 bytes result sent to driver
19/11/14 23:21:56 INFO TaskSetManager: Starting task 0.0 in stage 132.0 (TID 160, localhost, executor driver, partition 0, PROCESS_LOCAL, 7977 bytes)
19/11/14 23:21:56 INFO Executor: Running task 0.0 in stage 132.0 (TID 160)
19/11/14 23:21:56 INFO TaskSetManager: Finished task 1.0 in stage 132.0 (TID 159) in 896 ms on localhost (executor driver) (1/2)
19/11/14 23:21:56 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest6rg9nH/job_f1b336fb-5203-4677-8438-fc77f1e3abd7/MANIFEST
19/11/14 23:21:56 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest6rg9nH/job_f1b336fb-5203-4677-8438-fc77f1e3abd7/MANIFEST -> 0 artifacts
19/11/14 23:21:57 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/14 23:21:57 INFO main: Logging handler created.
19/11/14 23:21:57 INFO start: Status HTTP server running at localhost:37807
19/11/14 23:21:57 INFO main: semi_persistent_directory: /tmp
19/11/14 23:21:57 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/14 23:21:57 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573773713.3_147b132a-2896-4edb-8ed9-6a90672cb230', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/14 23:21:57 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573773713.3', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40535'}
19/11/14 23:21:57 INFO __init__: Creating state cache with size 0
19/11/14 23:21:57 INFO __init__: Creating insecure control channel for localhost:36855.
19/11/14 23:21:57 INFO __init__: Control channel established.
19/11/14 23:21:57 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/14 23:21:57 INFO __init__: Initializing SDKHarness with 12 workers.
19/11/14 23:21:57 INFO create_state_handler: Creating insecure state channel for localhost:37983.
19/11/14 23:21:57 INFO create_state_handler: State channel established.
19/11/14 23:21:57 INFO create_data_channel: Creating client data channel for localhost:41755
19/11/14 23:21:57 INFO GrpcDataService: Beam Fn Data client connected.
19/11/14 23:21:57 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/14 23:21:57 INFO run: No more requests from control plane
19/11/14 23:21:57 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/14 23:21:57 INFO close: Closing all cached grpc data channels.
19/11/14 23:21:57 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/14 23:21:57 INFO close: Closing all cached gRPC state handlers.
19/11/14 23:21:57 INFO run: Done consuming work.
19/11/14 23:21:57 INFO main: Python sdk harness exiting.
19/11/14 23:21:57 INFO GrpcLoggingService: Logging client hanged up.
19/11/14 23:21:57 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/14 23:21:57 INFO Executor: Finished task 0.0 in stage 132.0 (TID 160). 13710 bytes result sent to driver
19/11/14 23:21:57 INFO TaskSetManager: Finished task 0.0 in stage 132.0 (TID 160) in 826 ms on localhost (executor driver) (2/2)
19/11/14 23:21:57 INFO TaskSchedulerImpl: Removed TaskSet 132.0, whose tasks have all completed, from pool 
19/11/14 23:21:57 INFO DAGScheduler: ShuffleMapStage 132 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 1.727 s
19/11/14 23:21:57 INFO DAGScheduler: looking for newly runnable stages
19/11/14 23:21:57 INFO DAGScheduler: running: Set()
19/11/14 23:21:57 INFO DAGScheduler: waiting: Set(ResultStage 133)
19/11/14 23:21:57 INFO DAGScheduler: failed: Set()
19/11/14 23:21:57 INFO DAGScheduler: Submitting ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/14 23:21:57 INFO MemoryStore: Block broadcast_130 stored as values in memory (estimated size 26.1 KB, free 13.5 GB)
19/11/14 23:21:57 INFO MemoryStore: Block broadcast_130_piece0 stored as bytes in memory (estimated size 12.4 KB, free 13.5 GB)
19/11/14 23:21:57 INFO BlockManagerInfo: Added broadcast_130_piece0 in memory on localhost:46367 (size: 12.4 KB, free: 13.5 GB)
19/11/14 23:21:57 INFO SparkContext: Created broadcast 130 from broadcast at DAGScheduler.scala:1161
19/11/14 23:21:57 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 133 (EmptyOutputSink_0 MapPartitionsRDD[916] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/11/14 23:21:57 INFO TaskSchedulerImpl: Adding task set 133.0 with 1 tasks
19/11/14 23:21:57 INFO TaskSetManager: Starting task 0.0 in stage 133.0 (TID 161, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/14 23:21:57 INFO Executor: Running task 0.0 in stage 133.0 (TID 161)
19/11/14 23:21:57 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/11/14 23:21:57 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 1 ms
19/11/14 23:21:57 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest6rg9nH/job_f1b336fb-5203-4677-8438-fc77f1e3abd7/MANIFEST
19/11/14 23:21:57 INFO AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest6rg9nH/job_f1b336fb-5203-4677-8438-fc77f1e3abd7/MANIFEST -> 0 artifacts
19/11/14 23:21:58 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/11/14 23:21:58 INFO main: Logging handler created.
19/11/14 23:21:58 INFO start: Status HTTP server running at localhost:36157
19/11/14 23:21:58 INFO main: semi_persistent_directory: /tmp
19/11/14 23:21:58 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/14 23:21:58 WARN get_all_options: Discarding unparseable args: [u'--job_server_timeout=60', u'--app_name=test_windowing_1573773713.3_147b132a-2896-4edb-8ed9-6a90672cb230', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check'] 
19/11/14 23:21:58 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1573773713.3', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40535'}
19/11/14 23:21:58 INFO __init__: Creating state cache with size 0
19/11/14 23:21:58 INFO __init__: Creating insecure control channel for localhost:44141.
19/11/14 23:21:58 INFO __init__: Control channel established.
19/11/14 23:21:58 INFO __init__: Initializing SDKHarness with 12 workers.
19/11/14 23:21:58 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/14 23:21:58 INFO create_state_handler: Creating insecure state channel for localhost:42497.
19/11/14 23:21:58 INFO create_state_handler: State channel established.
19/11/14 23:21:58 INFO create_data_channel: Creating client data channel for localhost:35739
19/11/14 23:21:58 INFO GrpcDataService: Beam Fn Data client connected.
19/11/14 23:21:58 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/14 23:21:58 INFO run: No more requests from control plane
19/11/14 23:21:58 INFO run: SDK Harness waiting for in-flight requests to complete
19/11/14 23:21:58 INFO close: Closing all cached grpc data channels.
19/11/14 23:21:58 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/14 23:21:58 INFO close: Closing all cached gRPC state handlers.
19/11/14 23:21:58 INFO run: Done consuming work.
19/11/14 23:21:58 INFO main: Python sdk harness exiting.
19/11/14 23:21:58 INFO GrpcLoggingService: Logging client hanged up.
19/11/14 23:21:58 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/14 23:21:58 INFO Executor: Finished task 0.0 in stage 133.0 (TID 161). 11970 bytes result sent to driver
19/11/14 23:21:58 INFO TaskSetManager: Finished task 0.0 in stage 133.0 (TID 161) in 872 ms on localhost (executor driver) (1/1)
19/11/14 23:21:58 INFO TaskSchedulerImpl: Removed TaskSet 133.0, whose tasks have all completed, from pool 
19/11/14 23:21:58 INFO DAGScheduler: ResultStage 133 (foreach at BoundedDataset.java:124) finished in 0.877 s
19/11/14 23:21:58 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 4.234342 s
19/11/14 23:21:58 INFO SparkPipelineRunner: Job test_windowing_1573773713.3_147b132a-2896-4edb-8ed9-6a90672cb230 finished.
19/11/14 23:21:58 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/14 23:21:58 INFO AbstractArtifactRetrievalService: Manifest at /tmp/sparktest6rg9nH/job_f1b336fb-5203-4677-8438-fc77f1e3abd7/MANIFEST has 0 artifact locations
19/11/14 23:21:58 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktest6rg9nH/job_f1b336fb-5203-4677-8438-fc77f1e3abd7/
INFO:root:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 229, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 417, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 73, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
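
For context on what the timed-out test exercises: test_pardo_state_with_custom_key_coder runs a stateful ParDo whose state requests are addressed by the encoded user key, so the key coder must round-trip between the SDK harness and the runner. A minimal sketch of a stateful DoFn of this shape (illustrative names, not the actual test body; the real test presumably also registers a custom coder for its key type, e.g. via beam.coders.registry.register_coder):

    import apache_beam as beam
    from apache_beam.coders import VarIntCoder
    from apache_beam.transforms.userstate import BagStateSpec

    class BufferingDoFn(beam.DoFn):
      # Per-key bag state; the runner routes state requests using the
      # encoded key, which is why the key coder matters here.
      BUFFER = BagStateSpec('buffer', VarIntCoder())

      def process(self, kv, buffer=beam.DoFn.StateParam(BUFFER)):
        key, value = kv
        buffer.add(value)
        yield key, sum(buffer.read())

Stateful ParDo requires keyed input, e.g. beam.Create([('k', 1), ('k', 2)]) | beam.ParDo(BufferingDoFn()).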

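The BaseException above is raised by a timeout watchdog in the test harness (the handler frame at portable_runner_test.py line 73 in the stack): after 60 seconds it prints a banner plus a dump of live threads (the '# Thread:' lines accompanying the next failure) and then raises to unstick the hung wait. A rough sketch of that pattern, assuming a SIGALRM-based implementation with a hypothetical helper name:

    import signal
    import threading

    def install_timeout(seconds=60):
      def handler(signum, frame):
        msg = 'Timed out after %d seconds.' % seconds
        print('==================== %s ====================' % msg)
        for thread in threading.enumerate():
          print('# Thread: %s' % thread)
        # BaseException rather than Exception so that broad
        # `except Exception` clauses cannot swallow the timeout.
        raise BaseException(msg)
      signal.signal(signal.SIGALRM, handler)
      signal.alarm(seconds)
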
======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 497, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 427, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1573773704.48_7a8c5e02-92bd-4afa-880d-19421f3b1ae9 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

Thread dump emitted concurrently by the 60-second timeout watchdog:
==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140343172314880)>
# Thread: <Thread(Thread-118, started daemon 140343189100288)>
# Thread: <_MainThread(MainThread, started 140343968331520)>

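For orientation, test_sdf_with_watermark_tracking exercises a splittable DoFn (SDF). The UnsupportedOperationException above suggests the portable Spark runner has no bundle checkpoint handler registered, i.e. it cannot resume the residual restriction an SDF hands back when it defers work. A minimal SDF sketch (illustrative names, watermark estimation omitted, not the actual test body):

    import apache_beam as beam
    from apache_beam.io.restriction_trackers import OffsetRange
    from apache_beam.transforms.core import RestrictionProvider

    class StringRangeProvider(RestrictionProvider):
      def initial_restriction(self, element):
        return OffsetRange(0, len(element))

      def create_tracker(self, restriction):
        return restriction.new_tracker()

      def restriction_size(self, element, restriction):
        return restriction.size()

    class ExpandStringFn(beam.DoFn):
      def process(self, element,
                  tracker=beam.DoFn.RestrictionParam(StringRangeProvider())):
        # Claim one character position at a time; the runner may split
        # the remaining range and checkpoint the residual.
        cur = tracker.current_restriction().start
        while tracker.try_claim(cur):
          yield element[cur]
          cur += 1
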
----------------------------------------------------------------------
Ran 38 tests in 285.884s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 198

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
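
To reproduce locally, the failing task can typically be rerun from a Beam checkout using the flag suggested above (command assembled from the task name in this log, not copied from it):

    ./gradlew :sdks:python:test-suites:portable:py2:sparkValidatesRunner --stacktrace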

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 20s
59 actionable tasks: 46 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/ztom6h2aovwvo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org